S16 - More models support + Local LLMs #267

Open · 8 tasks
RamiAwar opened this issue Jul 19, 2024 · 1 comment
RamiAwar commented Jul 19, 2024

Implementation

  • Look into LiteLLM (OpenAI, Anthropic, Azure OpenAI support) + Ollama
  • Add LLM connection setup to user settings + a global default
  • Replace the OpenAI setup popup with an LLM provider setup
  • Set the default LLM provider on conversation creation
  • Add an LLM provider selection dropdown in conversations, listing already-configured providers + an "add more" button (so people realize they can add more)
  • Build an abstraction that takes an already-configured LLM provider as input (see the sketch after this list)
  • Add the currently active LLM for the current chat to QueryOptions and populate it in the conversation service
  • Use the LLM provider from QueryOptions instead of the current OpenAI-specific implementation in the query graph
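
A minimal sketch of what that provider abstraction could look like, routing every configured provider through LiteLLM's OpenAI-style completion call. The LLMProvider record and its field names are assumptions, not the actual DataLine models:

  # Sketch only: LLMProvider is a hypothetical config record per configured LLM.
  from dataclasses import dataclass
  from litellm import completion

  @dataclass
  class LLMProvider:
      model: str                     # e.g. "gpt-4o-mini", "claude-3-5-sonnet-20240620", "ollama/llama3"
      api_key: str | None = None     # not needed for local Ollama
      api_base: str | None = None    # e.g. "http://localhost:11434" for Ollama

  def complete(provider: LLMProvider, messages: list[dict]) -> str:
      # LiteLLM hides the per-provider differences behind one OpenAI-style call,
      # so OpenAI, Anthropic, Azure OpenAI and Ollama share a single code path.
      response = completion(
          model=provider.model,
          messages=messages,
          api_key=provider.api_key,
          api_base=provider.api_base,
      )
      return response.choices[0].message.content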

How do we manage which model is used in a new conversation?

  • Global default -> set in user settings
  • On creating a new conversation, set the LLM to the global default
  • The user can change the LLM per chat (dropdown like ChatGPT)
  • Save the LLM used per chat (a minimal sketch of this resolution rule follows the list)
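
As a sketch of the precedence rule above (all names hypothetical): a new conversation is pinned to the global default at creation time, and a per-chat selection, once saved, wins over that default:

  # Hypothetical models; the point is only the precedence rule.
  from dataclasses import dataclass

  @dataclass
  class Conversation:
      llm_provider: str | None = None   # per-chat override, saved with the chat

  def create_conversation(global_default: str) -> Conversation:
      # Pin the new conversation to the current global default so later changes
      # to the default don't silently switch existing chats.
      return Conversation(llm_provider=global_default)

  def resolve_llm(conversation: Conversation, global_default: str) -> str:
      # A saved per-chat selection wins; otherwise fall back to the global default.
      return conversation.llm_provider or global_default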

What will the model abstraction look like?

  model = state.options.get_model()   # resolve the LLM configured for this conversation
  model = model.bind_tools(tools)     # bind_tools returns the tool-aware model, so reassign
  response = model.invoke(messages)
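
One possible shape for get_model(), assuming the interface above is a LangChain-style chat model (bind_tools/invoke) and using LangChain's generic init_chat_model factory; the QueryOptions fields shown here are assumptions:

  # Sketch only: field names and the use of init_chat_model are not confirmed by the issue.
  from langchain.chat_models import init_chat_model

  class QueryOptions:
      def __init__(self, llm_provider: str, llm_model: str):
          self.llm_provider = llm_provider   # e.g. "openai", "anthropic", "ollama"
          self.llm_model = llm_model         # e.g. "gpt-4o-mini", "llama3"

      def get_model(self):
          # Returns a chat model exposing .bind_tools(...) and .invoke(...),
          # regardless of which provider the conversation is configured with.
          return init_chat_model(self.llm_model, model_provider=self.llm_provider)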

@RamiAwar (Owner, Author) commented:

Update: Let's do this later maybe...

[image attachment]
