Labels: enhancement (New feature or request)
Description
We would like to improve the experience of running chat-ui locally and reduce friction in the setup.
Current pain points:
- Having to set up your DB
- Having to point to an inference server
- Having to deal with either docker or git + npm
The goal is to reduce the friction with the following steps:
- Make the DB optional, use an in-process mongo server if it's not set (feat(db): use in memory db when MONGODB_URL not set #1773)
- Make it possible to download and run models directly from chat-ui (with node-llama-cpp?) feat: add a local endpoint type for inference directly from chat-ui #1778
- Allow admin login from the CLI à la Jupyter Notebook (feat: admin CLI login #1789)
- Add an onboarding UI for setting up new models from the app
- Provide an easy setup like `npx @huggingface/chat-ui` or `pip`
- Update the README so it focuses on quickstarting, and move the rest of the documentation to the docs page
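The first step above (making the DB optional) can be sketched as a small config resolver: if `MONGODB_URL` is set, connect to it; otherwise fall back to an in-process server. This is only an illustrative sketch, not the actual #1773 implementation; `resolveDbConfig` and the `DbConfig` type are hypothetical names, and the in-memory branch would in practice spin up something like `mongodb-memory-server`.

```typescript
// Hypothetical sketch of the MONGODB_URL fallback described in #1773.
// The real chat-ui implementation may differ.

type DbConfig =
  | { kind: "external"; url: string }   // user-provided MongoDB instance
  | { kind: "in-memory" };              // in-process fallback, no setup required

// Decide which database to use based on the environment.
function resolveDbConfig(env: Record<string, string | undefined>): DbConfig {
  const url = env.MONGODB_URL?.trim();
  // Empty or missing MONGODB_URL means the user skipped DB setup entirely.
  return url ? { kind: "external", url } : { kind: "in-memory" };
}

console.log(resolveDbConfig({}).kind);                                        // "in-memory"
console.log(resolveDbConfig({ MONGODB_URL: "mongodb://localhost:27017" }).kind); // "external"
```

The in-memory branch would then start the embedded server lazily on first connection, so a plain `npx` invocation works with zero configuration while power users keep pointing at their own MongoDB.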