🚨🚨🚨 The Great Deprecation 🚨🚨🚨 #3098
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
🤩
Looks good, thanks for addressing these (sometimes long overdue) deprecations.
I also did a check on the code base to see if anything was missing and found these potential candidates, WDYT?
- accelerate/src/accelerate/utils/modeling.py, line 562 (at d5b7b70):
  # TODO: at the next Transformers release (4.28.0) issue a deprecation warning here.
- https://github.com/huggingface/accelerate/blob/main/src/accelerate/memory_utils.py
- **TPU** -- This field will be deprecated in v0.27.0. Use XLA instead.
Thanks for taking care of that @muellerzr! LGTM!
What does this PR do?
This PR applies all deprecations for 1.0.0:
- `DataLoaderConfiguration` must now be used to set items like `even_batches`, `use_seedable_sampler`, etc. (see the first sketch after this list)
- `use_fp16` properties are now gone
- `Accelerator.autocast` no longer accepts a `cache_enabled` arg; users must make an `AutocastKwargs` instance and pass it in as a kwarg handler (see the second sketch after this list)
- `fsdp_backward_prefetch_policy` is replaced with `fsdp_backward_prefetch`
- `AcceleratedOptimizer.is_overflow` should be replaced with `AcceleratedOptimizer.step_was_skipped`
- `is_tpu_available` should be replaced with `is_torch_xla_available`
- `ACCELERATE_DISABLE_RICH` is no longer a valid env variable; one should manually enable `rich` traceback via `ACCELERATE_ENABLE_RICH`
- `utils.modeling.shard_checkpoint` should be replaced with `split_torch_state_dict_into_shards` from the `huggingface_hub` library (see the third sketch after this list)
- The `tqdm` wrapper no longer takes `True`/`False` as the first argument; one should manually pass in `main_process_only` as a kwarg instead (also covered in the third sketch)
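
A minimal migration sketch for the `DataLoaderConfiguration` change. The field names come from the list above; the variable names are only illustrative.

```python
from accelerate import Accelerator
from accelerate.utils import DataLoaderConfiguration

# Previously these were passed to Accelerator directly:
#   Accelerator(even_batches=True, use_seedable_sampler=True)
dataloader_config = DataLoaderConfiguration(
    even_batches=True,          # keep batch sizes uniform across processes
    use_seedable_sampler=True,  # use the fully seedable sampler for reproducibility
)
accelerator = Accelerator(dataloader_config=dataloader_config)
```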
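A sketch of the `Accelerator.autocast` change: `cache_enabled` now travels through an `AutocastKwargs` kwargs handler instead of being passed to `autocast()` directly.

```python
from accelerate import Accelerator
from accelerate.utils import AutocastKwargs

# Previously: with accelerator.autocast(cache_enabled=True): ...
accelerator = Accelerator(kwargs_handlers=[AutocastKwargs(cache_enabled=True)])

with accelerator.autocast():
    ...  # autocast-wrapped forward pass goes here
```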
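Finally, a sketch covering the checkpoint-sharding and `tqdm` items. Here `model`, `dataloader`, and `save_dir` are assumed to be defined elsewhere, and writing the shards with `safetensors.torch.save_file` is just one possible choice.

```python
import os

from accelerate.utils import tqdm
from huggingface_hub import split_torch_state_dict_into_shards
from safetensors.torch import save_file

# utils.modeling.shard_checkpoint -> huggingface_hub.split_torch_state_dict_into_shards
state_dict = model.state_dict()  # `model` is assumed to be a torch.nn.Module defined elsewhere
split = split_torch_state_dict_into_shards(state_dict, max_shard_size="5GB")
for filename, tensor_names in split.filename_to_tensors.items():
    shard = {name: state_dict[name] for name in tensor_names}
    save_file(shard, os.path.join(save_dir, filename))  # `save_dir` is assumed to exist

# tqdm wrapper: previously tqdm(True, dataloader); the flag is now a keyword argument
for batch in tqdm(dataloader, main_process_only=True):
    ...
```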
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@BenjaminBossan
@SunMarc