ZGI is an intuitive LLM platform that supports multiple LLM providers, integrates a variety of plugins, and delivers a personalized AI assistant experience.
- 🧠 Multi-Model Integration: Seamlessly connect to a variety of language models, from cloud providers such as OpenAI to local models such as LLaMA (illustrated in a sketch below).
- 🔌 Plugin Ecosystem: Enhance platform capabilities with a wide range of third-party plugins, including function calling for advanced interactions.
- 📄 RAG-Enhanced Retrieval: Chat with files in formats such as PDF, Markdown, JSON, Word, Excel, and images to build a powerful information retrieval system (see the retrieval sketch below).
- 🤖 Custom AI Agents: Create and tailor AI agents for specific tasks, providing solutions perfectly suited to your needs.
- 🗣️ Text-to-Speech: Convert AI-generated text into speech for a hands-free experience.
- 🎙️ Speech-to-Text (Coming Soon): Use voice input to interact with AI naturally and efficiently.
- 💾 Local Storage: Securely store data locally using in-browser IndexedDB, ensuring privacy and faster access (see the IndexedDB sketch below).
- 📤📥 Easy Import/Export: Import and export your documents and data with ease for smooth migration and backups.
- 📚 Knowledge Spaces (Coming Soon): Build custom knowledge bases to store and access information tailored to your interests.
- 👤 Personalization: Utilize the memory plugin for more contextual, personalized AI responses that adapt to your unique workflow.
- 🎙️ Chat with PDF: Coming soon.
- 📚 Knowledge Spaces: Coming soon.
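To make the multi-model feature concrete: many cloud and local backends (OpenAI itself, or a local LLaMA served through llama.cpp or Ollama, for example) expose an OpenAI-compatible chat completions endpoint, so switching providers can be as small as changing a base URL. The sketch below illustrates only that general pattern and is not ZGI's internal API; the `Provider` type, base URLs, model names, and API key placeholder are assumptions.

```typescript
// Illustration of provider switching against OpenAI-compatible endpoints.
// Not ZGI's internal API; all names and URLs here are example assumptions.
type Provider = {
  baseUrl: string;   // cloud API or local server
  apiKey?: string;   // local servers usually need no key
  model: string;
};

const openai: Provider = {
  baseUrl: "https://api.openai.com/v1",
  apiKey: "YOUR_OPENAI_API_KEY", // placeholder
  model: "gpt-4o-mini",
};

// A local LLaMA exposed through an OpenAI-compatible server (e.g. Ollama).
const localLlama: Provider = {
  baseUrl: "http://localhost:11434/v1",
  model: "llama3",
};

async function chat(provider: Provider, prompt: string): Promise<string> {
  const res = await fetch(`${provider.baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      ...(provider.apiKey ? { Authorization: `Bearer ${provider.apiKey}` } : {}),
    },
    body: JSON.stringify({
      model: provider.model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

// The same call works against either backend:
// await chat(openai, "Hello!");
// await chat(localLlama, "Hello!");
```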
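The RAG-enhanced retrieval feature follows the usual pattern of embedding document chunks and ranking them by cosine similarity against the query embedding. The sketch below shows only that generic ranking step, not ZGI's implementation; the `Chunk` shape is hypothetical and the embeddings are assumed to come from an embedding model elsewhere.

```typescript
// Generic RAG retrieval step: rank pre-embedded chunks by cosine similarity.
// The Chunk type and its data are placeholders, not ZGI's actual structures.
type Chunk = { text: string; embedding: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k chunks most similar to the query embedding.
function topK(query: number[], chunks: Chunk[], k = 3): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}
```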
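The local-storage feature builds on the browser's standard IndexedDB API. Below is a generic sketch of persisting a chat record in IndexedDB; the database name, object store, and record shape are invented for the example and are not ZGI's actual schema.

```typescript
// Generic IndexedDB sketch: persist a chat message in the browser.
// Database/store names and the record shape are hypothetical.
function openDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open("zgi-demo", 1);
    request.onupgradeneeded = () => {
      // Create the object store on first open (or version bump).
      request.result.createObjectStore("chats", { keyPath: "id" });
    };
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

async function saveMessage(id: string, content: string): Promise<void> {
  const db = await openDb();
  await new Promise<void>((resolve, reject) => {
    const tx = db.transaction("chats", "readwrite");
    tx.objectStore("chats").put({ id, content, createdAt: Date.now() });
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}
```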
- Ensure you have `yarn` or `bun` installed.
- Clone the repository:
  ```bash
  git clone https://github.com/zgiai/zgi.git
  cd zgi
  ```
- Install dependencies:
  ```bash
  yarn install
  # or
  bun install
  ```
- Start the development server:
  ```bash
  yarn dev
  # or
  bun dev
  ```
- Open your browser and navigate to http://localhost:3000.
- Detailed deployment instructions will be added soon. For now, you can deploy the app on platforms such as Vercel or Netlify.