Watch the live demo here.
This project uses the power of LLMs with an agent-based approach to provide accurate and effective answers to users' queries.
- A chat interface for interacting with the agent
- The agent can search the Jewish library, read a specific text, or fetch commentaries for a specific verse
- Flexible LLM provider support (Claude, GPT, Ollama)
- Python 3.11 or higher
- At least one of the following API keys (stored in `.env`):
  - Anthropic API key (for Claude)
  - OpenAI API key (for GPT)
  - Google API key (for Gemini)
- Or a local Ollama setup (for open-source models)
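Since only one of these keys is required, the app has to decide which provider to use at runtime. The function name and fallback order below are illustrative assumptions, not the project's actual selection logic (which lives in `llm_providers.py`):

```python
import os

def pick_provider(env=None):
    """Illustrative sketch: choose a provider based on which API key
    is present, falling back to a local Ollama setup."""
    env = os.environ if env is None else env
    if env.get("ANTHROPIC_API_KEY"):
        return "claude"
    if env.get("OPENAI_API_KEY"):
        return "gpt"
    if env.get("GOOGLE_API_KEY"):
        return "gemini"
    return "ollama"  # no hosted key found: assume a local model
```

For example, `pick_provider({"OPENAI_API_KEY": "sk-..."})` returns `"gpt"`, and an empty environment falls back to `"ollama"`.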
- Clone the repository
- Install dependencies:

  ```
  pip install -r requirements.txt
  ```

- Set up a `.env` file with your credentials:

  ```
  ANTHROPIC_API_KEY=your_anthropic_api_key
  OPENAI_API_KEY=your_openai_api_key
  GOOGLE_API_KEY=your_google_api_key
  ```

- Configure the index path:

  ```
  INDEX_PATH=path/to/your/index
  ```
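The project loads these values with `python-dotenv` (listed in the dependencies). As a rough sketch of what that loading step does, here is a minimal stand-in parser; in the real code you would simply call `dotenv.load_dotenv()`:

```python
import os

def load_env(path=".env"):
    """Minimal sketch of what python-dotenv's load_dotenv does:
    read KEY=VALUE lines into os.environ, skipping comments and
    blanks, without overwriting variables that are already set."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```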
Run the Streamlit UI to see the system in action:

```
streamlit run app.py
```

The system uses a Reason-and-Act (ReAct) architecture to achieve an autonomous agent.
The agent can use the search engine to find relevant passages in the Jewish Library, and use those passages to answer user queries.
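Conceptually, one ReAct cycle pairs a reasoning step with a tool call and an observation. The sketch below stubs out the three tools named above with placeholder lambdas; the real agent in `agent.py` lets the LLM choose the action, so this is only an assumption about the control flow, not the project's implementation:

```python
# Stubbed tools standing in for the real search/read/commentary calls.
TOOLS = {
    "search_library": lambda q: f"passages matching '{q}'",
    "read_text": lambda ref: f"text of {ref}",
    "get_commentaries": lambda ref: f"commentaries on {ref}",
}

def react_step(thought, action, action_input):
    """One Reason-and-Act cycle: the model reasons (thought),
    picks a tool (action), and records what it observed."""
    observation = TOOLS[action](action_input)
    return {"thought": thought, "action": action, "observation": observation}
```

The agent repeats such cycles, feeding each observation back into the next reasoning step, until it has enough context to answer.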
Key dependencies include:
- `langchain` and related packages for LLM integration
- `streamlit` for the user interface
- `tantivy` for document indexing and search
- `python-dotenv` for environment management
- Various LLM provider packages (anthropic, openai, ollama)
- Modify `tantivy_search.py` to customize search settings
- Modify `sefaria.py` to configure Sefaria API access
- Adjust `agent.py` to configure agent behavior
- Update `llm_providers.py` to add or modify LLM providers
- Customize `app.py` for UI modifications
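One common pattern for making a provider module extensible is a small registry that maps names to client factories. The sketch below is hypothetical; `llm_providers.py` may be structured quite differently:

```python
# Hypothetical provider registry; names and structure are illustrative.
PROVIDERS = {}

def register(name):
    """Decorator that records a factory under a provider name."""
    def wrap(factory):
        PROVIDERS[name] = factory
        return factory
    return wrap

@register("echo")
def make_echo():
    # Stand-in for a real client (anthropic, openai, ollama, ...).
    return lambda prompt: f"echo: {prompt}"

# A new provider is added by registering one more factory,
# with no changes to the code that looks providers up.
llm = PROVIDERS["echo"]()
```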
- Store API keys and sensitive data in environment variables
- Never upload your `.env` file to version control
- Ensure proper access controls for the Tantivy index
- Verify the Tantivy index exists and is accessible
- Check API key permissions for your chosen LLM provider
- Ensure proper file permissions for the document library
- Review logs for detailed error information
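The first two index checks above can be automated with a small preflight function. This is a sketch under the assumption that `INDEX_PATH` points at a directory, not part of the project itself:

```python
import os
from pathlib import Path

def check_index(index_path=None):
    """Preflight check: verify the Tantivy index directory exists
    and is readable, reading INDEX_PATH from the environment if
    no path is given. Returns 'ok' or a diagnostic message."""
    path = Path(index_path or os.environ.get("INDEX_PATH", ""))
    if not path.is_dir():
        return f"Index not found at '{path}' - check INDEX_PATH in .env"
    if not os.access(path, os.R_OK):
        return f"Index at '{path}' is not readable - check file permissions"
    return "ok"
```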
Contributions are welcome! Please feel free to submit pull requests or open issues for bugs and feature requests.
This project is licensed under the MIT License.
