
Improve whoami() error messages by specifying token source #2814

Merged (20 commits) on Feb 5, 2025

Conversation

aniketqw (Contributor):
This PR enhances the error messages in whoami() by indicating the source of the invalid token. Currently, when authentication fails, users receive a generic error message regardless of where their token came from (parameter, environment variable, file, or Colab). This can be confusing, especially in cases where an outdated HF_TOKEN environment variable takes precedence over a valid token file.

Changes:

  • Add _get_token_source() helper to detect token origin
  • Add _get_token_error_message() to generate source-specific messages
  • Update whoami() to use these helpers for error handling

Example improvements:

  • For env variables: "The token from HF_TOKEN environment variable is invalid. Note that HF_TOKEN takes precedence over the token file. Either update HF_TOKEN or unset it to use the token from huggingface-cli login."
  • For file tokens: "The token stored in ~/.cache/huggingface/token is invalid. Please run huggingface-cli login to update it."
  • For Colab: "The token from Google Colab vault is invalid. Please run huggingface-cli login in your Colab notebook to update it."

This change is backward compatible and only affects error messages.
Fixes #2809

Wauplin (Contributor):

Hi @aniketqw, thanks for taking care of this issue so quickly! The PR is clean, but I've left some suggestions to simplify it and make it more robust. Let me know if you have any questions :)

Three review comments on src/huggingface_hub/hf_api.py (outdated, resolved).
aniketqw (Contributor, Author):

I have added the new changes.

Wauplin (Contributor):

Thanks @aniketqw! Just a round of clean-up and we should be good to merge :)

Three more review comments on src/huggingface_hub/hf_api.py (outdated, resolved).
Wauplin (Contributor) commented on this snippet:

    if effective_token == _get_token_from_google_colab():
        error_message += (
            " The token from Google Colab vault is invalid. "
            "Please run `huggingface-cli login` in your Colab notebook to update it."
        )

Suggested change:

    - "Please run `huggingface-cli login` in your Colab notebook to update it."
    + "Please update it from the UI."

Three more review comments on src/huggingface_hub/hf_api.py (outdated, resolved).
Wauplin (Contributor) commented on this hunk (@@ -1616,37 +1616,54 @@):

        return self._thread_pool.submit(fn, *args, **kwargs)

    @validate_hf_hub_args
    from huggingface_hub.utils._auth import _get_token_from_google_colab, _get_token_from_environment, _get_token_from_file

Can you move this line to the imports section? And run `make style` + `make quality` locally to make sure the code styling aligns with the standards of this repo.

Wauplin (Contributor) commented on this hunk (@@ -1614,39 +1614,50 @@):

        self._thread_pool = ThreadPoolExecutor(max_workers=1)
        return self._thread_pool.submit(fn, *args, **kwargs)

    from huggingface_hub.utils._auth import _get_token_from_google_colab, _get_token_from_environment, _get_token_from_file

This must be moved to the import section (around `from .utils.endpoint_helpers import _is_emission_within_threshold`), and `make style` run afterwards to make sure imports are correctly sorted.

aniketqw (Contributor, Author):

Just did the changes.

Wauplin (Contributor):

Thank you @aniketqw! Much cleaner now, I think we should be good to merge :) I pushed a commit to remove an extra line and sort imports. I'll merge if CI is green :)

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

Wauplin (Contributor) commented Feb 5, 2025:

(sorry for the mess, should be good now 😬)

Wauplin merged commit d29290f into huggingface:main on Feb 5, 2025 (14 checks passed).
aniketqw (Contributor, Author) commented Feb 5, 2025:

Thanks, it was wonderful seeing the informative guidance throughout. Can you suggest another doable issue for me? I am very new to all this, but I am excited about using the huggingface library and want to contribute to it.

@aniketqw aniketqw deleted the whoami#2809 branch February 5, 2025 18:54
@aniketqw aniketqw restored the whoami#2809 branch February 5, 2025 18:54
@aniketqw aniketqw deleted the whoami#2809 branch February 5, 2025 18:54
Wauplin (Contributor) commented Feb 6, 2025:

Hi @aniketqw, thanks for your offer! Actually, at the moment we don't have many "good first issues" to contribute to (see here). Maybe you'd like to play with #1065, for instance? (i.e. adding one option to the delete-cache command, maybe --keep-last?) Let me know what you think, and let's discuss it on the other issue if needed.

aniketqw (Contributor, Author) commented Feb 6, 2025:

Hi @Wauplin, thanks for the guidance. I will take a look at #1065; I understood the gist of it and will see what I can do. Can you please provide some reliable sources I can refer to for guidance on solving this problem? I will get back to you, hopefully with my solution, on the other issue. Thanks once again.

Wauplin (Contributor) commented Feb 6, 2025:

@aniketqw here is a PR adding the --sort parameter, for instance: #2815 (comment)

Linked issue closed by this pull request: Better error message if invalid token on whoami
3 participants