
@adlumal adlumal commented Oct 24, 2025

Title

Add Isaacus embeddings provider with kanon-2-embedder support

Relevant issues

N/A - New provider integration

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory (adding at least 1 test is a hard requirement - see details)
    • Added comprehensive tests in tests/llm_translation/test_isaacus_embedding.py with 14 unit tests and 6 e2e tests
  • I have added a screenshot of my new test passing locally
    (screenshot of the passing tests attached)
  • My PR passes all unit tests on make test-unit - Isaacus-specific and related tests pass; 3 unrelated tests fail on my machine at the time of this PR
  • My PR's scope is as isolated as possible, it only solves 1 specific problem - Only adds Isaacus embeddings provider, no other modifications

Type

🆕 New Feature

Changes

This PR adds support for the Isaacus embeddings provider (kanon-2-embedder model) to LiteLLM.

Implementation:

  • Created litellm/llms/isaacus/embedding/transformation.py with IsaacusEmbeddingConfig class
  • Implements request transformation: OpenAI input parameter → Isaacus texts parameter
  • Implements response transformation: Isaacus response format → OpenAI-compatible format
  • Registered provider in core files: __init__.py, constants.py, main.py, types/utils.py, utils.py, get_llm_provider_logic.py
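The request/response mapping can be sketched roughly as follows. This is a minimal illustration only: the function names and payload shapes here are assumptions for readability, not the actual code in transformation.py.

```python
# Hypothetical sketch of the Isaacus embedding transformation described above.
# Function names and payload fields are assumptions, not the real LiteLLM code.

def transform_request(model: str, input, optional_params: dict) -> dict:
    """Map the OpenAI-style `input` parameter to Isaacus's `texts` parameter."""
    texts = [input] if isinstance(input, str) else list(input)
    payload = {"model": model, "texts": texts}
    payload.update(optional_params)  # e.g. task, dimensions, overflow_strategy
    return payload


def transform_response(isaacus_response: dict, model: str) -> dict:
    """Map an Isaacus-style response into the OpenAI embeddings shape."""
    return {
        "object": "list",
        "model": model,
        "data": [
            {"object": "embedding", "index": i, "embedding": item["embedding"]}
            for i, item in enumerate(isaacus_response.get("embeddings", []))
        ],
    }
```

The key point is that callers keep the familiar OpenAI `input` parameter; only the wire format changes underneath.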

Features:

  • Standard OpenAI embeddings syntax compatibility (drop-in replacement)
  • Support for single text and batch embedding requests
  • Support for async embedding via aembedding()
  • Isaacus-specific optional parameters:
    • task - Optimize for "retrieval/query" or "retrieval/document"
    • dimensions - Optionally reduce embedding dimensionality
    • overflow_strategy - Handle text overflow with "drop_end"
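Putting the optional parameters together, a call might look like the sketch below. Hedged example: it requires a valid Isaacus API key at call time, so the call is wrapped in a function rather than executed here, and the parameter names simply follow the list above.

```python
def embed_query(text: str):
    # Deferred import so this sketch reads standalone; in real code,
    # `import litellm` would sit at module top level.
    import litellm

    return litellm.embedding(
        model="isaacus/kanon-2-embedder",
        input=text,
        api_key="your-isaacus-api-key",
        task="retrieval/query",        # optimize embeddings for query-side retrieval
        dimensions=512,                # optionally reduce embedding dimensionality
        overflow_strategy="drop_end",  # drop overflowing text from the end
    )
```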

Testing:

  • 14 unit tests with mocked responses covering:
    • Different input formats (string, list)
    • Optional parameters handling
    • Request/response transformation
  • 6 end-to-end tests covering:
    • Basic embedding
    • Batch embedding
    • Optional parameters
    • Async embedding
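The mocked-response pattern behind the unit tests can be illustrated with a self-contained sketch. This is not the real test code (which lives in tests/llm_translation/test_isaacus_embedding.py); it only shows the general shape of asserting against a canned Isaacus-style payload.

```python
from unittest.mock import MagicMock

# Illustrative only: a mock HTTP client returns a canned Isaacus-style
# response, and the test asserts on the embedding shape.
def test_embedding_response_shape():
    mock_client = MagicMock()
    mock_client.post.return_value.json.return_value = {
        "embeddings": [{"embedding": [0.1, 0.2, 0.3]}]
    }
    body = mock_client.post("/embeddings", json={"texts": ["hello"]}).json()
    assert len(body["embeddings"]) == 1
    assert len(body["embeddings"][0]["embedding"]) == 3
```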

Documentation:

  • Added Isaacus section to docs/my-website/docs/embedding/supported_embedding.md
  • Included usage examples and parameter descriptions

Files changed:

  • litellm/llms/isaacus/__init__.py (new)
  • litellm/llms/isaacus/embedding/__init__.py (new)
  • litellm/llms/isaacus/embedding/transformation.py (new)
  • litellm/__init__.py (added isaacus_key, imported IsaacusEmbeddingConfig)
  • litellm/constants.py (registered isaacus provider)
  • litellm/main.py (added isaacus routing logic)
  • litellm/types/utils.py (added ISAACUS to LlmProviders enum)
  • litellm/utils.py (registered IsaacusEmbeddingConfig in ProviderConfigManager)
  • litellm/litellm_core_utils/get_llm_provider_logic.py (added isaacus provider detection)
  • tests/llm_translation/test_isaacus_embedding.py (new - 14 unit tests plus 6 e2e tests that require an API key and are skipped by default)
  • tests/test_litellm/llms/isaacus/embedding/__init__.py (new)
  • tests/test_litellm/llms/isaacus/embedding/test_transformation.py (new)
  • docs/my-website/docs/embedding/supported_embedding.md (added Isaacus documentation)

Example usage:

```python
import litellm

response = litellm.embedding(
    model="isaacus/kanon-2-embedder",
    input="Legal text to embed",
    api_key="your-isaacus-api-key"
)
```
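Since the PR also adds aembedding() support, an async variant might look like the sketch below. Hedged example: it needs an Isaacus API key configured at call time, so the coroutine is only defined, not run, here.

```python
import asyncio


async def embed_documents(texts: list[str]) -> list[list[float]]:
    # Deferred import so this sketch reads standalone.
    import litellm

    response = await litellm.aembedding(
        model="isaacus/kanon-2-embedder",
        input=texts,                # batch embedding: a list of strings
        task="retrieval/document",  # optimize for document-side retrieval
    )
    return [item["embedding"] for item in response.data]


# Example invocation (requires an Isaacus API key to be configured):
# embeddings = asyncio.run(embed_documents(["Clause 1 ...", "Clause 2 ..."]))
```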

This is my first time contributing to LiteLLM, so please let me know of any errors or mistakes.


vercel bot commented Oct 24, 2025

@adlumal is attempting to deploy a commit to the CLERKIEAI Team on Vercel.

A member of the Team first needs to authorize it.


CLAassistant commented Oct 24, 2025

CLA assistant check
All committers have signed the CLA.

@adlumal force-pushed the feat/isaacus-embeddings-provider branch from 7d25621 to 58982a7 on October 24, 2025 07:13
@regismesquita (Contributor)

Looking forward to having this merged :)
