
Conversation

Collaborator

@nWEIdia commented May 8, 2024

and make the error message clearer when apex.transformer is explicitly called on an unsupported platform

In some configurations, PyTorch may not be built with distributed support, so we avoid auto-importing apex.transformer, which explicitly calls the torch.distributed API.

Collaborator


also, remove "transformer" from the dunder all

Collaborator Author


Sorry, what does "the dunder all" mean? :)

Collaborator


Meaning:

__all__ = ["amp", "fp16_utils", "optimizers", "normalization", "transformer"]
