
WIP: Add multi-LLM provider support and latest server changes #1

Draft
SamOwens1 wants to merge 1 commit into main from feature/multi-llm-provider-and-chat-endpoint

Conversation

@SamOwens1

Introduce a provider abstraction layer so the /chat HTTP endpoint can use Anthropic (default), OpenAI, or Google Gemini as the LLM backend. The provider is selected via the LLM_PROVIDER environment variable, or overridden per request.

New files:

  • providers/ package (base class, Anthropic, OpenAI, Gemini implementations)
  • providers/tool_converter.py for Anthropic → OpenAI/Gemini schema conversion
  • anthropic_tools.json tool definitions
  • docs_search.py documentation search engine
  • mcp_server_instructions.md AI behaviour guidelines
  • patchworks-knowledge-base.md bundled product documentation
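The tool_converter.py module above presumably maps between the public tool schemas of the two APIs. As a hedged sketch (the function name is an assumption, but the two wire formats are the documented Anthropic and OpenAI shapes): Anthropic tools carry their JSON Schema under `input_schema`, while OpenAI's function-calling format nests the same schema under `function.parameters`.

```python
def anthropic_tool_to_openai(tool: dict) -> dict:
    """Convert one Anthropic tool definition to OpenAI's function-calling shape.

    Anthropic: {"name", "description", "input_schema": {...JSON Schema...}}
    OpenAI:    {"type": "function", "function": {"name", "description", "parameters"}}
    """
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool["input_schema"],
        },
    }
```

A Gemini conversion would be analogous, placing the schema in a function declaration's `parameters` field.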

Modified files:

  • server.py — /chat endpoint delegates to configured provider
  • patchworks_client.py — latest Patchworks API client changes
  • pyproject.toml — added openai and google-genai dependencies
  • .env.example — added LLM provider configuration variables

No changes to MCP tools, stdio transport, or existing Anthropic behaviour.
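One possible shape for the new .env.example variables. Only LLM_PROVIDER is named in this PR; the API-key variable names below are the conventional ones read by each vendor SDK, but they are assumptions here.

```shell
# Select the chat backend: anthropic (default), openai, or gemini
LLM_PROVIDER=anthropic

# Credentials for whichever providers you enable (variable names assumed)
ANTHROPIC_API_KEY=
OPENAI_API_KEY=
GOOGLE_API_KEY=
```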

@SamOwens1 SamOwens1 marked this pull request as draft February 27, 2026 15:03