distributed_train.sh appears to support single-machine multi-GPU training. Does this project support multi-machine, multi-GPU training?

Replies: 1 comment

@imchinfei Yes, you need to modify the .sh script to include the master address/IP info. You can find examples of distributed launch scripts in the PyTorch code and docs.
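For reference, here is a minimal sketch of what such a multi-node launcher could look like, assuming the project's single-node distributed_train.sh simply wraps train.py with a PyTorch launcher. The script name, the `NNODES`/`MASTER_ADDR`/`MASTER_PORT` defaults, and the node-rank positional argument are illustrative placeholders, not part of this project:

```bash
#!/bin/bash
# Hypothetical multi-node variant of distributed_train.sh (a sketch, not the
# project's official script). Run once per machine:
#   ./distributed_train_multinode.sh <num_gpus_per_node> <node_rank> [train.py args...]
# Assumes torchrun is available (PyTorch >= 1.10) and every node can reach
# MASTER_ADDR on MASTER_PORT.

NUM_PROC=$1     # GPUs to use on this node
NODE_RANK=$2    # 0 on the master node, 1..N-1 on the others
shift 2

NNODES=${NNODES:-2}                   # total number of machines (placeholder default)
MASTER_ADDR=${MASTER_ADDR:-10.0.0.1}  # IP/hostname of the rank-0 node (placeholder)
MASTER_PORT=${MASTER_PORT:-29500}     # any free TCP port, identical on all nodes

torchrun \
    --nnodes="$NNODES" \
    --node_rank="$NODE_RANK" \
    --nproc_per_node="$NUM_PROC" \
    --master_addr="$MASTER_ADDR" \
    --master_port="$MASTER_PORT" \
    train.py "$@"
```

You would run the same command on every machine, changing only the node rank, e.g. `./distributed_train_multinode.sh 8 0 ...` on the master node and `./distributed_train_multinode.sh 8 1 ...` on the second node.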