All Gather Once (FSDP Zero-one) #1865
base: main
Conversation
MaxText/configs/base.yml
Outdated
@@ -186,6 +186,9 @@ pipeline_delay_activation_forwarding: False # This delays the activation forward
# the communication and compute in each iteration are now independent. However this comes at the cost of doubling the pipeline bubble,
# and you must set the number of microbatches to at least 2 * num_stages (the minimum 2 * num_stages is set by default with this delay).

model_fsdp_ag_once: False # This controls whether the Zero-1 optimization is active.
How does this interplay with pipeline_fsdp_ag_once? If model_fsdp_ag_once is set to true, what happens when pipeline_fsdp_ag_once is set to true or false?
MaxText/layers/models.py
Outdated
The goal of this optimization is to reduce communication overhead. In the standard
FSDP implementation, an all-gather operation on the model weights is performed twice
for each microbatch (once for the forward pass, once for the backward pass).
microbatch -> gradient accumulation microbatch
Can we also add a unit test (integration test) with this feature enabled to train_tests? In your PR description, you can link the profiles before and after (ideally with pdb=1 to make the feature more obvious).
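The review thread above concerns the core counting argument: gathering the sharded weights once per training step instead of once per gradient-accumulation microbatch. Below is a minimal, hypothetical sketch of that idea, not MaxText's actual implementation: `all_gather_params`, the loss, and both step functions are illustrative stand-ins, and a real FSDP all-gather is a cross-device collective rather than the counting stub used here.

```python
import jax
import jax.numpy as jnp

NUM_MICROBATCHES = 4  # gradient-accumulation microbatches per training step
gather_calls = {"count": 0}

def all_gather_params(sharded_params):
    # Stub for the FSDP all-gather collective; it only counts invocations.
    # In real FSDP this materializes the full weights on every device.
    gather_calls["count"] += 1
    return sharded_params

def loss_fn(params, batch):
    # Toy quadratic loss so jax.grad has something to differentiate.
    return jnp.mean((batch @ params) ** 2)

def step_gather_per_microbatch(sharded_params, microbatches):
    # Baseline: re-gather the weights for every microbatch.
    grads = jnp.zeros_like(sharded_params)
    for mb in microbatches:
        full_params = all_gather_params(sharded_params)
        grads = grads + jax.grad(loss_fn)(full_params, mb)
    return grads

def step_gather_once(sharded_params, microbatches):
    # "ag once" variant: a single gather, reused across all microbatches.
    full_params = all_gather_params(sharded_params)
    grads = jnp.zeros_like(sharded_params)
    for mb in microbatches:
        grads = grads + jax.grad(loss_fn)(full_params, mb)
    return grads
```

With N microbatches, the baseline issues N gathers (2N when the backward pass re-gathers, as the comment quoted above notes), while the gather-once variant issues exactly one, trading communication for the memory needed to keep the full weights live across the step.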
Description
Start with a short description of what the PR does and how this is a change from
the past.
The rest of the description includes relevant details and context, examples:
If the change fixes a bug or a GitHub issue, please include a link, e.g.:
FIXES: b/123456
FIXES: #123456
Notice 1: Once all tests pass, the "pull ready" label will automatically be assigned.
This label is used for administrative purposes. Please do not add it manually.
Notice 2: For external contributions, our settings currently require an approval from a MaxText maintainer to trigger CI tests.
Tests
Please describe how you tested this change, and include any instructions and/or
commands to reproduce.
Checklist
Before submitting this PR, please make sure (put X in square brackets):