Add a Heat aware DistributedSampler for torch usage. #1807
base: main
Conversation
… on a process local basis.
for more information, see https://pre-commit.ci
Thank you for the PR!
Codecov Report
Attention: Patch coverage is …

Additional details and impacted files

@@            Coverage Diff             @@
##              main    #1807      +/-  ##
==========================================
- Coverage    92.26%   91.62%   -0.64%
==========================================
  Files           84       84
  Lines        12447    12554     +107
==========================================
+ Hits         11484    11503      +19
- Misses         963     1051      +88
When using the normal comm.Bcast, the broadcast only works the first time; on subsequent calls the receiving rank ends up with garbage values.

❯ mpirun -np 2 python test.py
0 tensor([2, 4, 3, 0, 1], dtype=torch.int32)
1 tensor([2, 4, 3, 0, 1], dtype=torch.int32)
...
1 tensor([455, 0, 0, 0, 32], dtype=torch.int32)
0 tensor([1, 4, 3, 2, 0], dtype=torch.int32)
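The test.py used above is not included in the thread; the following is a minimal sketch of what such a reproduction could look like. It assumes heat exposes its world communicator as ht.MPI_WORLD and that its Bcast accepts torch tensors in-place; both are assumptions, not details confirmed by this PR.

```python
import heat as ht
import torch

# Assumption: ht.MPI_WORLD is heat's world communicator and its Bcast
# fills a preallocated torch tensor in-place, mpi4py-style.
comm = ht.MPI_WORLD

for epoch in range(3):
    if comm.rank == 0:
        # Rank 0 draws a fresh permutation each epoch ...
        perm = torch.randperm(5, dtype=torch.int32)
    else:
        # ... and the other ranks receive it into an empty buffer.
        perm = torch.empty(5, dtype=torch.int32)
    comm.Bcast(perm, root=0)
    print(comm.rank, perm)
```

If the receive buffer is only registered correctly on the first call, later broadcasts would leave it untouched, which would match the uninitialized-looking values rank 1 prints above.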
Due Diligence
Description
Issue/s resolved: #1789
Changes proposed:
Add a Heat-aware DistributedSampler for PyTorch use cases (sketched below)
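Since the implementation itself is not shown here, the following is a minimal sketch of what a process-local, Heat-aware sampler could look like. The class name HeatDistributedSampler, the constructor signature, and the seed-plus-epoch reshuffle are illustrative assumptions, not the API added by this PR.

```python
import torch
from torch.utils.data import Sampler


class HeatDistributedSampler(Sampler):
    """Illustrative sketch: yields indices into the rows a process holds
    locally, so no inter-process communication is needed while iterating."""

    def __init__(self, dndarray, shuffle=True, seed=0):
        # DNDarray.lshape is the shape of the process-local chunk.
        self.local_length = dndarray.lshape[0]
        self.shuffle = shuffle
        self.seed = seed
        self.epoch = 0

    def set_epoch(self, epoch):
        # Mirrors torch's DistributedSampler: call once per epoch so the
        # shuffle differs across epochs but stays reproducible.
        self.epoch = epoch

    def __iter__(self):
        if self.shuffle:
            g = torch.Generator()
            g.manual_seed(self.seed + self.epoch)
            indices = torch.randperm(self.local_length, generator=g)
        else:
            indices = torch.arange(self.local_length)
        return iter(indices.tolist())

    def __len__(self):
        return self.local_length
```

Because each rank shuffles only its own local indices, a sampler along these lines sidesteps the cross-rank Bcast of a global permutation discussed above.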
Type of change
Does this change modify the behaviour of other functions? If so, which?
no