
byebye torch 2.0 #37277

Merged · 15 commits merged into main from bump/torch on Apr 7, 2025
Conversation

ydshieh (Collaborator) commented Apr 4, 2025

What does this PR do?

For #37137

Fixes #37238

github-actions bot marked this pull request as draft April 4, 2025 12:38

github-actions bot commented Apr 4, 2025

Hi 👋, thank you for opening this pull request! Pull requests are converted to draft by default, and CI is paused while the PR is in draft mode. When it is ready for review, please click the Ready for review button (at the bottom of the PR page); this will assign reviewers and trigger CI.

ydshieh marked this pull request as ready for review April 4, 2025 12:38
github-actions bot requested review from amyeroberts and qubvel April 4, 2025 12:39
ydshieh requested review from Rocketknight1 and removed request for amyeroberts and qubvel April 4, 2025 12:40
ydshieh (Collaborator, Author) commented Apr 4, 2025

@SunMarc @muellerzr You may want to double-check the changes in trainer / fsdp / training args / quantizer.

HuggingFaceDocBuilderDev commented

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

ydshieh (Collaborator, Author) commented Apr 4, 2025

Not urgent, but @Rocketknight1, this PR is ready. The failing tests are unrelated to this PR.

cyyever (Contributor) left a comment

PT 2.1 was released in 2023; in 2025 it is safe to require it as the lower bound.

@@ -91,7 +91,7 @@
     "tiktoken": "tiktoken",
     "timm": "timm<=1.0.11",
     "tokenizers": "tokenizers>=0.21,<0.22",
-    "torch": "torch>=2.0",
+    "torch": "torch>=2.1",
cyyever (Contributor) commented Apr 6, 2025

Can we specify torch>=2.1.2?

ydshieh (Collaborator, Author) commented

Could you explain the reason? We usually pin the major/minor version, not the patch version. We can do that if there is a good reason, of course.

cyyever (Contributor) commented

Because is_torch_sdpa_available returns True on torch>=2.1.2.
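
For context, a minimal sketch of what such a patch-gated availability check can look like, assuming the `packaging` library; this is illustrative, not the actual transformers implementation:

```python
# Sketch of a patch-level feature gate (illustrative, not the actual
# transformers code). SDPA (torch.nn.functional.scaled_dot_product_attention)
# shipped in torch 2.0, but fixes landed through the 2.1.x patch releases,
# hence a comparison against a patch version rather than a minor one.
from packaging import version


def is_torch_sdpa_available_sketch() -> bool:
    try:
        import torch
    except ImportError:
        return False
    # version.parse tolerates local build tags such as "2.1.2+cu121".
    return version.parse(torch.__version__) >= version.parse("2.1.2")
```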

ydshieh (Collaborator, Author) commented

I see. Given that most people are likely using newer torch versions, such a specific pin (down to the patch version) isn't really necessary IMO.

IIRC, when we have needed to pin patch versions, the reason was that earlier patch versions had serious issues that made several models unusable. One example I remember was audio models failing at some point.

ydshieh (Collaborator, Author) commented

I will merge it as is. If we find such a pin is necessary, we can update it in the future.

SunMarc (Member) left a comment

Thanks for updating! Just a few nits.

Rocketknight1 (Member) left a comment

Yes, this LGTM! As long as we keep the function for backward compatibility with remote code I think this is pretty safe, and cleans things up a lot!
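
A minimal sketch of the backward-compatibility pattern being discussed here: keep the old helper importable so remote code (loaded with `trust_remote_code`) does not break, even though the check is now trivially true. The exact shim in this PR may differ:

```python
# Sketch (assumed shape, not necessarily the PR's exact code): with
# torch>=2.1 enforced at install time, the old comparison flag becomes a
# constant, but deleting the name would break remote code that does
#   from transformers.pytorch_utils import is_torch_greater_or_equal_than_2_1
is_torch_greater_or_equal_than_2_1 = True
```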

(Merge commit resolving conflicts in src/transformers/modeling_utils.py)
ydshieh (Collaborator, Author) commented Apr 7, 2025

Going to merge; the failing tests in

tests/models/llama4/test_processor_llama4.py

are unrelated to this PR and are already known to fail on main.

ydshieh merged commit e7ad077 into main on Apr 7, 2025 (19 of 21 checks passed)
ydshieh deleted the bump/torch branch April 7, 2025 13:19
vasqu pushed a commit to vasqu/transformers that referenced this pull request Apr 7, 2025
* bump Torch 2.1 with broken compatibility `torch.compile`

* dep table

* remove usage of is_torch_greater_or_equal_than_2_1

* remove usage of is_torch_greater_or_equal_than_2_1

* remove if is_torch_greater_or_equal("2.1.0")

* remove torch >= "2.1.0"

* deal with 2.0.0

* PyTorch 2.0+ --> PyTorch 2.1+

* ruff 1

* difficult ruff

* address comment

* address comment

---------

Co-authored-by: Jirka B <[email protected]>
Co-authored-by: ydshieh <[email protected]>
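
The commit list above includes removing `if is_torch_greater_or_equal("2.1.0")` gates. A hedged before/after sketch of that cleanup pattern (illustrative function bodies and fallback, not the actual diff; the import path for `is_torch_greater_or_equal` is assumed):

```python
import math

import torch
import torch.nn.functional as F


def _eager_attention(q, k, v):
    # Hypothetical pre-SDPA fallback: plain softmax attention.
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    return scores.softmax(dim=-1) @ v


def attention_before(q, k, v):
    # Before the bump: gate on the runtime torch version.
    from transformers.utils import is_torch_greater_or_equal

    if is_torch_greater_or_equal("2.1.0"):
        return F.scaled_dot_product_attention(q, k, v)
    return _eager_attention(q, k, v)


def attention_after(q, k, v):
    # After the bump: torch>=2.1 is guaranteed by the dependency table,
    # so the version gate and the fallback branch are simply deleted.
    return F.scaled_dot_product_attention(q, k, v)
```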
Successfully merging this pull request may close these issues:

transformers 4.50 does not work with pytorch 2.0 (#37238)

6 participants