
Ollama integration broken with latest litellm version #3538

Open
heaven00 opened this issue Mar 14, 2025 · 3 comments


heaven00 commented Mar 14, 2025

Issue

The latest version of litellm breaks the integration with Ollama (BerriAI/litellm#9224). One option is to wait for the fix to land in litellm and then upgrade again.

Alternatively, if possible, could we pin litellm to v1.63.2-stable? That release does not include the commit that is currently breaking the Ollama integration via litellm.

Or, alternatively, is there documentation for downgrading to an older version of aider?

Version and model info

Aider v0.77
Model: ollama/qwq:32b
edit format: full

heaven00 (Author) commented Mar 14, 2025

> Or, alternatively, is there documentation for downgrading to an older version of aider?

`uv tool install --force --python python3.12 [email protected]` is a temporary fix for anyone who runs into this issue.


gregid commented Mar 16, 2025

It looks like the https://github.com/BerriAI/litellm/releases/tag/v1.63.11-stable release already includes the fix:

Fix "system" role has become unacceptable in ollama by @briandevvn in BerriAI/litellm#9261
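For context on what broke: the failure mode was that requests containing a `"system"` role message were rejected on the Ollama path. A minimal client-side workaround, for anyone who cannot upgrade yet, is to fold system messages into the first user message before sending. This is only an illustrative sketch (the function name `fold_system_messages` is my own, and this is not litellm's actual fix):

```python
def fold_system_messages(messages):
    """Merge "system" messages into the first "user" message so the
    request only contains roles the endpoint accepts.

    Hypothetical workaround sketch; not litellm's actual fix (#9261).
    """
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    rest = [dict(m) for m in messages if m["role"] != "system"]
    if not system_parts:
        return rest
    prefix = "\n\n".join(system_parts)
    if rest and rest[0]["role"] == "user":
        # Prepend the system text to the first user turn.
        rest[0]["content"] = prefix + "\n\n" + rest[0]["content"]
    else:
        # No user turn to merge into: emit the system text as a user message.
        rest.insert(0, {"role": "user", "content": prefix})
    return rest

msgs = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello"},
]
print(fold_system_messages(msgs))
# -> [{'role': 'user', 'content': 'You are a helpful assistant.\n\nHello'}]
```

Upgrading to litellm v1.63.11-stable or later makes this kind of workaround unnecessary.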

heaven00 (Author) commented

Thank you, I will update to the latest version.
