Remove plural types and aliases for native pytorch types #677
Conversation
ensure_tuple() to work with all Sequence types
There's some additional typing stuff that will need to be adjusted in `composer/utils/iter_helpers.py`.

composer/utils/iter_helpers.py
- def ensure_tuple(x):
+ def ensure_tuple(x) -> Tuple[Any, ...]:
iter_helpers.py is the exception to the rule and does not contain any type annotations itself. Instead, iter_helpers.pyi contains all the type annotations for iter_helpers.py
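To illustrate the `.py`/`.pyi` split described above, here is a minimal sketch (the function body is illustrative; the real implementation lives in `composer/utils/iter_helpers.py`):

```python
# iter_helpers.py -- the implementation, intentionally left unannotated
def ensure_tuple(x):
    """Coerce a singleton, tuple, list, or range into a tuple."""
    if isinstance(x, tuple):
        return x
    if isinstance(x, (list, range)):
        return tuple(x)
    return (x,)

# iter_helpers.pyi -- the stub the type checker reads instead of the .py:
#
#     from typing import Any, Tuple
#     def ensure_tuple(x: Any) -> Tuple[Any, ...]: ...
```

With this arrangement, annotations added directly to `iter_helpers.py` would be ignored by the type checker, which is why the suggestion above belongs in the `.pyi` file.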
composer/utils/iter_helpers.py
- if isinstance(x, tuple):
-     return x
- if isinstance(x, list):
+ if isinstance(x, (tuple, list, range)):
Suggested change:

- if isinstance(x, (tuple, list, range)):
+ if isinstance(x, (tuple, list, range, collections.abc.Sequence)):
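One subtlety with this suggestion: `tuple`, `list`, and `range` are already virtual subclasses of `collections.abc.Sequence`, so the ABC check alone subsumes the concrete types. It also matches `str` and `bytes`, which callers usually intend as singletons. A quick demonstration:

```python
import collections.abc

# The concrete types are all registered Sequence subclasses,
# so checking the ABC already covers tuple, list, and range...
for seq in ((1, 2), [1, 2], range(2)):
    assert isinstance(seq, collections.abc.Sequence)

# ...but str and bytes are Sequences too, which is usually not
# what "a sequence of items" is meant to accept.
assert isinstance("abc", collections.abc.Sequence)
assert isinstance(b"abc", collections.abc.Sequence)

# Non-sequence iterables (sets, generators) are still excluded.
assert not isinstance({1, 2}, collections.abc.Sequence)
```

So whichever form lands, the `str`/`bytes` case is worth an explicit test.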
Good point; this will need to be clarified in the docs somewhere. At a high level, …
…/composer into landan/fix_ensure_tuple
I think I addressed most comments, but I'm not very familiar with interfaces so it's likely I set that up wrong 😅. I will start making tests now. I have a couple of docstring consistency questions that maybe should be addressed somewhere else:
…/composer into landan/fix_ensure_tuple
Errr, there are a lot of … The second line gets …

It depends on what is being referenced; sometimes, …
Good point; let's go with the torch types. I'd eventually like to remove types.py or make it significantly leaner.
Yes, let's do the "singleton or sequence" approach.
Yup!

It shouldn't need a manual cast; double check the type annotation for …
@Landanjs I pushed a commit that (hopefully?) fixes the type errors with the …
The recent change tries to:
I did not convert Callbacks and Loggers to "singleton or sequence" format since these are converted into lists instead of tuples, and I'm not confident the functionality is the same. But maybe this should be changed in the future.
Looking good! Main feedback is to remove the plural types from types.py (sorry I didn't mention this before) and the rest of the codebase (so we'll specify Unions in the type annotations, where necessary). Then I think it should be good to merge!
composer/core/types.py
      initialize optimizers.
  Tensor (torch.Tensor): Alias for :class:`torch.Tensor`.
- Tensors (Tensor | Tuple[Tensor, ...] | List[Tensor]): Commonly used to represent e.g. a set of inputs,
+ Tensors (torch.Tensor | Sequence[torch.Tensor]): Commonly used to represent e.g. a set of inputs,
Can you remove this docstring (and the associated type alias below in the code)?
composer/core/types.py
  Metrics (Metric | MetricCollection): Union type covering common formats for representing metrics.
  Optimizer (torch.optim.Optimizer): Alias for :class:`torch.optim.Optimizer`
- Optimizers (Optimizer | List[Optimizer] | Tuple[Optimizer, ...]): Union type for indeterminate amounts of optimizers.
+ Optimizers (torch.optim.Optimizer | Sequence[torch.optim.Optimizer]): Union type for indeterminate amounts of
Same as above -- can you remove this docstring (and the associated type alias below in the code)?
composer/core/types.py
      as :class:`torch.optim.lr_scheduler.ConstantLR`
  Scaler (torch.cuda.amp.grad_scaler.GradScaler): Alias for :class:`torch.cuda.amp.GradScaler`.
  JSON (str | float | int | None | List['JSON'] | Dict[str, 'JSON']): JSON Data
+ Evaluators (Many[Evaluator]): Union type for indeterminate amounts of evaluators.
Same as above -- can you remove this docstring (and the associated type alias below in the code)?
composer/core/types.py
  T = TypeVar('T')
- Many = Union[T, Tuple[T, ...], List[T]]
+ Many = Union[T, Sequence[T]]
Let's remove Many too
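For context on what removing `Many` means in practice: a generic `Union` alias parameterizes by substituting the type variable, so `Many[int]` and the inline union denote the same type. A small sketch (`head` is an illustrative function, not Composer code):

```python
import collections.abc
from typing import Sequence, TypeVar, Union

T = TypeVar('T')
Many = Union[T, Sequence[T]]  # the alias slated for removal

# Parameterizing the alias just substitutes T, so both spellings
# denote the same type:
assert Many[int] == Union[int, Sequence[int]]

# After removal, annotations spell the union out inline:
def head(xs: Union[int, Sequence[int]]) -> int:
    """Return xs itself if it is a singleton, else its first element."""
    return xs[0] if isinstance(xs, collections.abc.Sequence) else xs
```

The inline form is more verbose, but each call site shows exactly what it accepts without a hop to `types.py`.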
composer/core/types.py
  Tensor = torch.Tensor
- Tensors = Many[Tensor]
+ Tensors = Union[Tensor, Sequence[Tensor]]
Same thing: removing the aliases here will ensure that we remove them throughout the entire codebase and use the underlying Union[X, Sequence[X]] notation everywhere.
composer/trainer/trainer.py
      .. seealso:: :mod:`composer.optim` for the different optimizers built into Composer.
- schedulers (Schedulers, optional): The learning rate schedulers. If ``[]`` or ``None``, will be set to
-     ``[constant_scheduler]``. (default: ``None``).
+ schedulers (types.PyTorchScheduler or Sequence[types.PyTorchScheduler], optional):
The trainer accepts both PyTorch schedulers and Composer schedulers.
Suggested change:

- schedulers (types.PyTorchScheduler or Sequence[types.PyTorchScheduler], optional):
+ schedulers (types.ComposerScheduler | types.PyTorchScheduler | Sequence[types.ComposerScheduler | types.PyTorchScheduler], optional):
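Spelled out as an annotation, that docstring corresponds to roughly the following signature. This is only a sketch: `ComposerScheduler` and `PyTorchScheduler` here are empty stand-ins for the real types in `composer.core.types`, and `SchedulerInput`/`set_schedulers` are illustrative names, not Composer APIs.

```python
from typing import Optional, Sequence, Union

# Hypothetical stand-ins for the real types in composer.core.types
class ComposerScheduler: ...
class PyTorchScheduler: ...

SchedulerInput = Union[ComposerScheduler, PyTorchScheduler,
                       Sequence[Union[ComposerScheduler, PyTorchScheduler]]]

def set_schedulers(schedulers: Optional[SchedulerInput] = None) -> list:
    """Normalize the argument to a list; None becomes []."""
    if schedulers is None:
        return []
    if isinstance(schedulers, (ComposerScheduler, PyTorchScheduler)):
        return [schedulers]
    return list(schedulers)
```

This is the "singleton or sequence" pattern discussed above: callers may pass one scheduler, a sequence of them, or nothing.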
composer/trainer/trainer.py
- schedulers (Schedulers, optional): The learning rate schedulers. If ``[]`` or ``None``, will be set to
-     ``[constant_scheduler]``. (default: ``None``).
+ schedulers (types.PyTorchScheduler or Sequence[types.PyTorchScheduler], optional):
+     The learning rate schedulers. If ``[]`` or ``None``, will be set to ``[constant_scheduler]``.
Suggested change:

- The learning rate schedulers. If ``[]`` or ``None``, will be set to ``[constant_scheduler]``.
+ The learning rate schedulers. If ``[]`` or ``None``, the learning rate will be constant.
There appear to be a few docstring formatting warnings that need to be fixed (docstring warnings are treated as errors and cause the build to fail).
Remove re-exported types from types.py. This should help with code readability, as now the underlying type is directly imported.
`Dict[str, Any]` is almost as short but clearer on what it is (a dictionary with string keys).
LGTM!
ensure_tuple() to work with all Sequence types
I may have misunderstood the assignment, but attempting to address #640 and #510.
Essentially:
- Support `range` objects in `ensure_tuple()`, since these were the last type of `Sequence` we did not support.
- Changed `T | Tuple[T] | List[T]` in `Many` to `T | Sequence[T]`
- Changed `T | Tuple | List` to `Many`

If this actually looks right, I will add tests for `ensure_tuple` and every sequence type.

Side note: I'm a little confused when to use `PytorchSchedulers` vs `ComposerSchedulers`.
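As a sketch of the tests promised here, assuming the `ensure_tuple` behavior discussed in this thread (the stand-in below mirrors that behavior; the real function is in `composer.utils.iter_helpers`):

```python
# Stand-in mirroring the ensure_tuple behavior discussed in the thread.
def ensure_tuple(x):
    if isinstance(x, tuple):
        return x
    if isinstance(x, (list, range)):
        return tuple(x)
    return (x,)

def test_ensure_tuple_sequences():
    assert ensure_tuple((1, 2)) == (1, 2)       # tuple passes through unchanged
    assert ensure_tuple([1, 2]) == (1, 2)       # list is converted
    assert ensure_tuple(range(3)) == (0, 1, 2)  # range is converted
    assert ensure_tuple(5) == (5,)              # singleton is wrapped

test_ensure_tuple_sequences()
```

These cases cover each supported `Sequence` type plus the singleton path; a `str` input would also be worth pinning down explicitly, given the ambiguity noted earlier in the thread.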