
[llama4][auxiliary-loss-free load balancing] update expert_bias without backward hooks #1304

Open · wants to merge 2 commits into main

Conversation

hann-wang (Contributor)

Changes:

  • Update expert_bias once per optimizer step instead of via backward hooks.

Reasons:

  • Friendlier to torch.compile and activation checkpointing.
  • The original implementation updates expert_bias on every microbatch during gradient accumulation, rather than once per optimizer step (see the sketch below).
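
A minimal sketch of the per-optimizer-step approach, assuming DeepSeek-V3-style auxiliary-loss-free balancing; the names `update_expert_bias`, `tokens_per_expert`, and `bias_update_speed` are illustrative and not this PR's actual identifiers:

```python
import torch

@torch.no_grad()
def update_expert_bias(expert_bias: torch.Tensor,
                       tokens_per_expert: torch.Tensor,
                       bias_update_speed: float = 1e-3) -> None:
    """Nudge routing biases toward a balanced expert load."""
    expected_load = tokens_per_expert.float().mean()
    # Underloaded experts get a positive nudge, overloaded ones a negative nudge.
    expert_bias += bias_update_speed * torch.sign(expected_load - tokens_per_expert)
    # Reset the counts accumulated across the microbatches of this step.
    tokens_per_expert.zero_()


# Register the update to run once after each optimizer step, instead of in a
# backward hook that would fire on every microbatch during gradient accumulation.
expert_bias = torch.zeros(8)        # one bias per expert (illustrative)
tokens_per_expert = torch.zeros(8)  # filled by the router during forward
optimizer = torch.optim.AdamW([torch.nn.Parameter(torch.randn(4))], lr=1e-4)
optimizer.register_step_post_hook(
    lambda opt, args, kwargs: update_expert_bias(expert_bias, tokens_per_expert)
)
```

Because the update runs outside the autograd graph, there are no hooks for torch.compile or activation checkpointing to trace through.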

cc @tianyu-l

facebook-github-bot added the CLA Signed label (managed by the Meta Open Source bot) on Jun 16, 2025