
✨ Add inferenceProvider filter when listing models #1198

Conversation

frascuchon (Contributor)

This PR adds the missing query param inference_provider when listing models (listModels).
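
Below is a minimal usage sketch, not part of the PR, showing how the new filter could be called from @huggingface/hub; the `search` option shape follows the diff further down, while the provider name and the printed field are illustrative assumptions.

import { listModels } from "@huggingface/hub";

// Illustrative sketch: iterate over models served by a given inference
// provider, using the new `inferenceProvider` search filter from this PR.
let count = 0;
for await (const model of listModels({ search: { inferenceProvider: "together" } })) {
	console.log(model.name);
	if (++count >= 10) break;
}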

julien-c (Member) left a comment:

let's wait for a @SBrandeis review to make sure the API is final

Wauplin (Contributor) commented Feb 12, 2025:

(following this PR to implement the same in huggingface_hub once settled)

@@ -63,6 +63,7 @@ export async function* listModels<
owner?: string;
task?: PipelineType;
tags?: string[];
inferenceProvider?: string;
Contributor:

You can search for several providers at once

Suggested change:
- inferenceProvider?: string;
+ inferenceProvider?: string[];

frascuchon (Contributor, author):

Passing more than one provider raises an error:

Error: Internal Error - We're working hard to fix this as soon as possible!. URL: https://huggingface.co/api/models?limit=10&expand=pipeline_tag&expand=private&expand=gated&expand=downloads&expand=likes&expand=lastModified&expand=inferenceProviderMapping&inference_provider=together&inference_provider=replicate. Request ID: Root=1-67ae03c6-377f36e3001096932c7699a4

Contributor:

With this URL it works:
https://huggingface.co/api/models?limit=10&expand=pipeline_tag&expand=private&expand=gated&expand=downloads&expand=likes&expand=lastModified&expand=inferenceProviderMapping&inference_provider=together,replicate

inference_provider=together,replicate instead of inference_provider=together&inference_provider=replicate

That being said - this should not end up in a Server Error 😵 💫

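As a side note, a hypothetical sketch (not from the PR) of how several providers could be serialized into that comma-separated form; `inference_provider` is the query parameter name from the URLs above, and the helper itself is illustrative.

// Illustrative only: join several providers into one comma-separated
// query value, matching the URL format that works above.
function buildModelsUrl(providers: string[], limit = 10): string {
	const url = new URL("https://huggingface.co/api/models");
	url.searchParams.set("limit", String(limit));
	url.searchParams.set("inference_provider", providers.join(","));
	return url.toString();
}

// e.g. https://huggingface.co/api/models?limit=10&inference_provider=together%2Creplicate
console.log(buildModelsUrl(["together", "replicate"]));
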
frascuchon (Contributor, author):

Oh cool, thanks! I thought the query param had the same syntax as the expand one.

frascuchon (Contributor, author):

And the filter terms are combined with an OR clause, right? I mean inference_provider=together,replicate returns models with one of those providers.

Member:

yes it's an OR
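
To spell out the OR semantics at the HTTP level, here is a minimal sketch against the public endpoint; the comma-separated value comes from the thread, while the response field name `id` is an assumption for illustration.

// Models served by Together OR Replicate (comma-separated values are OR'ed).
const res = await fetch(
	"https://huggingface.co/api/models?limit=10&inference_provider=together,replicate"
);
const models: Array<{ id: string }> = await res.json();
for (const model of models) {
	console.log(model.id);
}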

frascuchon (Contributor, author):

Okay. I think it's ready.

Member:

Thank you @frascuchon !

SBrandeis merged commit 4f04904 into huggingface:main on Feb 14, 2025
4 checks passed