Labels: enhancement (New feature or request)
Feature request
Is your feature request related to a problem? Please describe.
I'd like to be able to use a local model with Ollama.
Describe the solution you'd like
I want to be able to use a local Ollama instance with seed.
Describe alternatives you've considered
Cloud-based models require an Internet connection as well as an account with a third party.
It would be possible to use something like LocalAI, which provides an OpenAI-compatible API for local models, but that is a fairly heavyweight solution.
Additional context
LangChain.js already has Ollama support.
It seems like this could be as simple as allowing an `OLLAMA_MODEL` environment variable that can be used to construct a `ChatOllama` instance.
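To make the idea concrete, here is a minimal sketch of what env-var-driven model selection could look like. The `pickChatModelConfig` helper, the `ChatModelConfig` shape, and the `OLLAMA_BASE_URL` / `OPENAI_MODEL` variables are all illustrative assumptions, not seed's actual API:

```typescript
// Illustrative sketch only: pickChatModelConfig, ChatModelConfig, and the
// non-OLLAMA_MODEL environment variables are hypothetical names, not part
// of seed. Ollama's default local endpoint is http://localhost:11434.
interface ChatModelConfig {
  provider: "ollama" | "openai";
  model: string;
  baseUrl?: string;
}

function pickChatModelConfig(
  env: Record<string, string | undefined>
): ChatModelConfig {
  if (env.OLLAMA_MODEL) {
    // Local model requested: route to the Ollama instance.
    return {
      provider: "ollama",
      model: env.OLLAMA_MODEL,
      baseUrl: env.OLLAMA_BASE_URL ?? "http://localhost:11434",
    };
  }
  // Otherwise fall back to the existing cloud-based behavior.
  return { provider: "openai", model: env.OPENAI_MODEL ?? "gpt-4o-mini" };
}
```

With a config like this in hand, seed could then construct the LangChain.js chat model, e.g. `new ChatOllama({ model: cfg.model, baseUrl: cfg.baseUrl })` from `@langchain/ollama`, instead of the cloud-backed model.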