Support for more LLMs? Soon! #246
RamiAwar
announced in
Announcements
Update September 2024: We've investigated this, and none of the local models we tested were good enough at tool calling to work with DataLine's specific toolkit. Since the results were unusable, we've paused this work; we'll revisit and retest every 3 months.
As for custom provider support, we also weren't happy with other providers: we ran into many structured-output issues, tool-calling issues, differing API formats, etc. We've decided not to invest more time here for now, and will likewise revisit in around 3 months. It's hard to stay up to date with everything without it consuming all of your time, but the APIs do seem to be converging on the OpenAI API, albeit slowly.
Update August, 2024: Work has begun! #267
Phase 1 🚧
@anthony2261 is working on adding a custom LLMProvider architecture to replace what we have now. This will later be populated from the frontend, allowing custom base URLs.
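For context, the provider architecture described above could look something like the following. This is a minimal sketch, not DataLine's actual code: the class names (`LLMProvider`, `LLMConfig`, `EchoProvider`) and fields are assumptions made for illustration. The key idea is a common interface plus a configurable `base_url`, which is what the frontend would later populate.

```python
# Hypothetical sketch of a pluggable LLM provider interface.
# LLMProvider / LLMConfig / EchoProvider are illustrative names, not
# DataLine's real implementation.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Optional


@dataclass
class LLMConfig:
    api_key: str
    base_url: Optional[str] = None  # custom base URL, settable from the frontend
    model: str = "gpt-4o"


class LLMProvider(ABC):
    """Common interface every provider implementation must satisfy."""

    def __init__(self, config: LLMConfig):
        self.config = config

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return the model's completion for the given prompt."""


class EchoProvider(LLMProvider):
    # Trivial stand-in provider, used here only to show the plug-in shape;
    # a real implementation would call the provider's HTTP API at base_url.
    def complete(self, prompt: str) -> str:
        return f"[{self.config.model}] {prompt}"


if __name__ == "__main__":
    # e.g. pointing at a locally hosted server via a custom base URL
    provider = EchoProvider(
        LLMConfig(api_key="sk-test", base_url="http://localhost:11434")
    )
    print(provider.complete("hello"))
```

Adding a new backend (Anthropic, Ollama, ...) would then just mean adding another `LLMProvider` subclass, without touching the calling code.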
Phase 2 📝
Support LLMProvider configuration from settings page
Phase 3 📝
Add more LLMProvider implementations (Anthropic, Ollama, etc.), supporting more hosted models as well as local models