A modern LLM chat interface with MCP (Model Context Protocol) integration.
This app is still under active development; not all features work reliably yet.
- Multi-LLM Support: OpenAI GPT, Anthropic Claude, Google Gemini
- MCP Integration: Connect to multiple MCP servers for tools and data sources
- Real-time Communication: WebSocket-based chat interface
- Custom UI: MCP servers can modify the UI with custom HTML
- Authorization: Group-based access control for MCP servers
- Modern Stack: React frontend, FastAPI backend, Docker support
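The group-based access control for MCP servers can be sketched as a simple group-intersection check. This is an illustrative sketch, not the project's actual implementation; names like `can_access_server` and `allowed_groups` are assumptions:

```python
# Sketch of group-based access control for MCP servers.
# Illustrative only -- function and field names are assumptions,
# not the actual Atlas UI 3 implementation.

def can_access_server(user_groups: set[str], allowed_groups: set[str]) -> bool:
    """A user may use an MCP server if they share at least one group with it."""
    # An empty allow-list means the server is open to everyone.
    if not allowed_groups:
        return True
    return bool(user_groups & allowed_groups)

# Example: filter the MCP servers a given user is allowed to see.
servers = {
    "filesystem": {"allowed_groups": {"admins"}},
    "search": {"allowed_groups": set()},  # public
}
user_groups = {"developers"}
visible = [
    name for name, cfg in servers.items()
    if can_access_server(user_groups, cfg["allowed_groups"])
]
```

Here `visible` contains only the public `search` server, since the user is not in the `admins` group.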
```bash
docker build -t atlas-ui-3 .
docker run -p 8000:8000 atlas-ui-3
```

Important: This project uses uv as the Python package manager.
```bash
# Install uv if needed
curl -LsSf https://astral.sh/uv/install.sh | sh

# Set up the environment
uv venv && source .venv/bin/activate  # Windows: .venv\Scripts\activate
uv pip install -r requirements.txt

# Configure
cp .env.example .env  # Edit with your API keys

# Build the frontend
cd frontend && npm install && npm run build

# Optional: start the mock S3 server (switching to MinIO soon)
cd mocks/s3-mocks && python main.py

# Start the backend
cd ../backend && python main.py

# Or, the quickest way to start: use agent_start.sh
bash agent_start.sh
```
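After copying `.env.example` to `.env`, fill in your provider credentials. The variable names below are illustrative guesses only; use the keys actually listed in `.env.example`:

```
# Illustrative only -- check .env.example for the real variable names
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=...
```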
- Backend: FastAPI + WebSockets
- Frontend: React + Vite + Tailwind CSS
- Python Package Manager: uv (not pip!)
- Configuration: Pydantic with type safety
- Containerization: Docker
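Type-safe configuration with Pydantic means settings are validated once at startup instead of failing deep in a request handler. A minimal sketch, assuming field names that are illustrative and not the project's actual schema:

```python
# Minimal sketch of type-safe settings with Pydantic.
# Field names are illustrative assumptions, not the real config schema.
from pydantic import BaseModel, ValidationError

class AppConfig(BaseModel):
    openai_api_key: str       # required: no default, so it must be provided
    host: str = "0.0.0.0"
    port: int = 8000

# Values are validated and coerced on construction:
cfg = AppConfig(openai_api_key="sk-test", port="9000")  # "9000" becomes int 9000

# Missing or mistyped fields raise immediately instead of failing later:
try:
    AppConfig(port=8000)
except ValidationError:
    pass  # openai_api_key is missing
```

A real deployment would typically load these values from environment variables (e.g. via `pydantic-settings`) rather than constructing the model by hand.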
- Use `uv` for Python package management, not pip or conda
- Don't use `uvicorn --reload`; it causes problems in development
- Use `npm run build` instead of `npm run dev` for frontend development
- File limit: maximum 400 lines per file for maintainability
- Container Environment: Use Fedora latest for Docker containers (GitHub Actions uses Ubuntu runners)
Copyright 2025 National Technology & Engineering Solutions of Sandia, LLC (NTESS). Under the terms of Contract DE-NA0003525 with NTESS, the U.S. Government retains certain rights in this software
MIT License
