
Migrate AttentionOp to use NNX #1867


Merged
merged 1 commit into main on Jun 24, 2025

Conversation

bvandermoon
Collaborator

Description

Update the AttentionOp module to use NNX instead of Linen.

AttentionOp has no dependencies or parameters of its own. Attention and Gpt3MultiHeadAttention both depend on it.

Tests

Successfully trained the MaxText base config on a TPU VM for 10 steps:

python3 -m MaxText.train MaxText/configs/base.yml \
    run_name=<run_name> \
    base_output_directory=gs://<gcs_bucket> \
    dataset_type=synthetic \
    steps=10

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed.

bvandermoon force-pushed the bvandermoon-nnx-dev branch from 4a56ba2 to 6d021dc on June 24, 2025 01:57
richjames0 (Collaborator) left a comment


lgtm with thanks

copybara-service bot merged commit 35f3325 into main on Jun 24, 2025
19 checks passed
copybara-service bot deleted the bvandermoon-nnx-dev branch on June 24, 2025 17:33