
[Bug]: Max Output Tokens doesn't work for Gemini #5461

Open

KiGamji opened this issue Jan 25, 2025 · 3 comments
Labels
bug Something isn't working

Comments

@KiGamji
Contributor

KiGamji commented Jan 25, 2025

What happened?

When creating a new chat, the Max Output Tokens field defaults to 8192 instead of being blank, which is already odd:

[Screenshot]

Setting it to a very small value shows that the limit is not applied at all:

[Screenshot]

Steps to Reproduce

.

What browsers are you seeing the problem on?

No response

Relevant log output

Screenshots

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct
KiGamji added the bug (Something isn't working) label on Jan 25, 2025
@KiGamji
Contributor Author

KiGamji commented Jan 25, 2025

Apparently, the Max Context Tokens option doesn't work either.

@Originalimoc

I tried a direct API translation and it works:
https://github.com/Originalimoc/OpenAI_API_Adapter-Google
Access it via IP:18788(--port)/v1/chat/completions and use it with https://www.librechat.ai/docs/configuration/librechat_yaml/ai_endpoints
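A minimal sketch of how such a proxy could be registered as a custom endpoint in librechat.yaml, following the ai_endpoints docs linked above. The endpoint name, host, environment variable, and model id below are illustrative assumptions, not values from this thread:

```yaml
# librechat.yaml — hypothetical custom endpoint pointing at a local
# OpenAI-compatible adapter listening on port 18788 (all names assumed)
endpoints:
  custom:
    - name: "Gemini via Adapter"           # display name (assumed)
      apiKey: "${GEMINI_ADAPTER_KEY}"      # placeholder; the adapter may ignore it
      baseURL: "http://127.0.0.1:18788/v1" # the adapter's --port from the comment
      models:
        default: ["gemini-1.5-pro"]        # model id is an assumption
```

Because the adapter speaks the OpenAI chat-completions format, LibreChat's standard max-token handling for custom endpoints would apply instead of the broken Gemini path.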

@danny-avila
Owner

danny-avila commented Jan 26, 2025

"max output tokens"

Currently working for Vertex AI, fixing for Gemini API.

"Apparently, Max Context Tokens option doesn't work either."

Fixing along with #5466
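For context, the public Gemini REST API enforces this cap through a maxOutputTokens field inside generationConfig on the generateContent request, so a fix amounts to forwarding the UI value into that field. A minimal sketch of such a payload (the helper name, prompt, and limit are illustrative, not LibreChat's actual code):

```python
import json


def build_gemini_payload(prompt: str, max_output_tokens: int) -> dict:
    """Build a generateContent request body with an output-token cap.

    The field names follow the public Gemini REST API; everything else
    (this helper, the values) is an illustrative assumption.
    """
    return {
        "contents": [{"parts": [{"text": prompt}]}],
        "generationConfig": {
            # The cap the UI's "Max Output Tokens" field should map to:
            "maxOutputTokens": max_output_tokens,
        },
    }


payload = build_gemini_payload("Say hi.", 16)
print(json.dumps(payload))
```

If the integration omits generationConfig (or drops it when building the request), the model falls back to its own default limit, which would produce exactly the behavior reported above.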
