[FEAT] [Multi-modal] Reorganizing ViT Attention Backend and decouple from Text Attention Backend #27919
Conversation
… unittest Signed-off-by: tjtanaa <[email protected]>
Code Review
This pull request is a great refactoring that decouples the ViT attention backend selection from the text attention backend and moves platform-specific logic to the platform interface. This significantly improves code organization and maintainability. I've identified a couple of critical issues that need to be addressed to ensure the new logic works as intended.
💡 Codex Review
Here are some automated review suggestions for this pull request.
ℹ️ About Codex in GitHub
Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".
@codex review
Codex Review: Didn't find any major issues. 👍
ywang96
left a comment
Thanks for working on this amazing effort! I will review this PR tonight!
@ywang96 @DarkLight1337 I have rebased it with upstream again. It contains the enablement of … I would like to suggest that the further enablement of

```python
self.flash_attn_varlen_func = maybe_get_vit_flash_attn_backend(
    self.attn_backend,
)
```

be done in another PR after this one.
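For context, the model-side wiring the snippet above refers to could look roughly like the sketch below. The helper here is a stub and the string backend names are assumptions for illustration; only the attribute and helper names come from the quoted snippet.

```python
def maybe_get_vit_flash_attn_backend(attn_backend):
    # Stub: hand out a varlen flash-attention callable only when the
    # selected backend actually has a flash-attention implementation.
    if attn_backend == "FLASH_ATTN":
        return lambda q, k, v: "varlen"
    return None


class VisionAttention:
    """Toy ViT attention module storing the callable from the helper."""

    def __init__(self, attn_backend):
        self.attn_backend = attn_backend
        # Mirrors the snippet quoted in the comment above.
        self.flash_attn_varlen_func = maybe_get_vit_flash_attn_backend(
            self.attn_backend,
        )
```

With this split, a model never hard-codes a flash-attention function; it asks the helper and falls back gracefully when `None` is returned.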
jikunshang
left a comment
LGTM!
This pull request has merge conflicts that must be resolved before it can be merged.
Purpose
This is an implementation of RFC #27821.
1. It decouples the ViT attention backend selection logic from the text attention backend, redefining `_MHA_Backend` following the discussion in https://github.com/vllm-project/vllm/pull/27061/files#r2443909604 .
2. It makes ViT backend selection platform-specific.
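The decoupling in point 1 can be sketched roughly as follows. The enum members and the fallback preference order below are illustrative assumptions, not vLLM's actual `_MHA_Backend` definition.

```python
from enum import Enum, auto


class MHABackend(Enum):
    # Hypothetical members; a ViT-specific enum no longer needs to
    # reuse the text-attention backend registry.
    FLASH_ATTN = auto()
    XFORMERS = auto()
    TORCH_SDPA = auto()


def select_vit_backend(requested, supported):
    # An explicit request must be validated against platform support.
    if requested is not None:
        backend = MHABackend[requested.upper()]
        if backend not in supported:
            raise ValueError(f"{backend.name} is not supported on this platform")
        return backend
    # Otherwise fall back in a fixed preference order.
    for candidate in (MHABackend.FLASH_ATTN, MHABackend.XFORMERS, MHABackend.TORCH_SDPA):
        if candidate in supported:
            return candidate
    raise RuntimeError("no supported ViT attention backend")
```

Because the enum is separate, a platform can support (say) flash attention for text while only offering SDPA for the ViT encoder.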
ViT attention should be platform-specific: the supported ViT backends should be determined through the platform interface, and any overriding logic should be performed there as well, since only the platform interface knows which other ViT backends it can fall back to. We should avoid adding overriding logic to the `model.py` files. Accordingly, `get_vit_attn_backend` in the platform interface must be able to access `--mm-encoder-attn-backend`.
The platform interface should only return a `_MHA_Backend`; it should not return functions. The flash attention functions should only be returned through `maybe_get_vit_flash_attn_backend`. If a model only supports a specific type of attention, the ViT overriding logic will be implemented explicitly in the model definition file (`model.py`).
Test Plan
Add unit tests for:
- `current_platform.get_supported_vit_attn_backends()`

Set up unit tests for all affected models:
- tests/models/multimodal/generation/test_ovis2_5.py
- tests/models/multimodal/generation/test_maverick.py
- tests/models/multimodal/generation/test_dots_ocr.py
- tests/models/multimodal/generation/test_keye.py
- tests/models/multimodal/generation/test_qwen2_5_vl.py
- tests/models/multimodal/generation/test_qwen2_vl.py
- tests/models/multimodal/generation/test_qwen3_omni_moe_thinker.py
- tests/models/multimodal/generation/test_ernie45_vl.py
- tests/models/multimodal/generation/test_glm4_1v.py

Test Result
All unit tests passed
Essential Elements of an Effective PR Description Checklist
- Update `supported_models.md` and `examples` for a new model.