Is your feature request related to a problem? Please describe.
The AP2 project currently supports only the "gemini-2.5-flash" model. This is evident from several Python agent definitions in the repo (such as RetryingLlmAgent and its usage in root_agent and the subagents), where `model="gemini-2.5-flash"` is hard-coded as the LLM in use. The code constructs the LLM client with `genai.Client()` and makes content requests through the Gemini API.
Describe the solution you'd like
I'd like the AP2 reference implementation to be LLM-agnostic, so the model (and ideally the provider) can be configured rather than hard-coded.
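As a minimal first step, the hard-coded model string could be replaced with a configurable lookup. The sketch below is hypothetical (the `AP2_LLM_MODEL` environment variable and `resolve_model` helper are not part of the repo); it only illustrates falling back to the current default when nothing is configured:

```python
import os

# Current default used throughout the repo's agent definitions.
DEFAULT_MODEL = "gemini-2.5-flash"


def resolve_model(env_var: str = "AP2_LLM_MODEL") -> str:
    """Return the configured model name, falling back to the default.

    Agents would then be constructed with model=resolve_model()
    instead of a hard-coded literal.
    """
    return os.environ.get(env_var, DEFAULT_MODEL)
```

A fuller solution would also abstract the client construction (currently `genai.Client()`) behind a provider interface, so non-Gemini backends can be plugged in.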
Describe alternatives you've considered
No response
Additional context
No response
Code of Conduct
- I agree to follow this project's Code of Conduct