
fix: respect configured context_size in embed_batch #67

Open
echobt wants to merge 1 commit into main from fix/issue-200

Conversation


@echobt (Contributor) commented on Jan 21, 2026

Description

This PR fixes an issue where the configured context_size was ignored in embed_batch(). The function was initializing LlamaContextParams with defaults, causing it to use the model's default context size (usually 512) instead of the user's configuration.

Fix Details

  • Modified embed_batch() in src/core/embeddings.rs to explicitly set .with_n_ctx(...) using the stored self.n_ctx.
  • This ensures that batch embedding operations respect the configured context window size.
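The one-line change described above can be sketched as follows. This is a minimal mock, not the real code: `ContextParams` and `Embedder` below are hypothetical stand-ins for `LlamaContextParams` and the embedding struct in `src/core/embeddings.rs`, kept only to show the before/after builder pattern (defaults are used unless `.with_n_ctx(...)` is called explicitly).

```rust
/// Hypothetical stand-in for LlamaContextParams: a builder that
/// falls back to the model's default context size (512) unless
/// the caller sets n_ctx explicitly.
#[derive(Clone, Copy)]
struct ContextParams {
    n_ctx: u32, // context window size in tokens
}

impl Default for ContextParams {
    fn default() -> Self {
        // Mirrors the default the bug silently fell back to.
        ContextParams { n_ctx: 512 }
    }
}

impl ContextParams {
    fn with_n_ctx(mut self, n_ctx: u32) -> Self {
        self.n_ctx = n_ctx;
        self
    }
}

/// Hypothetical stand-in for the embedder in src/core/embeddings.rs,
/// which stores the user-configured context size at construction.
struct Embedder {
    n_ctx: u32,
}

impl Embedder {
    // Before the fix: embed_batch built its context from defaults
    // only, so self.n_ctx was never consulted.
    fn params_before_fix(&self) -> ContextParams {
        ContextParams::default()
    }

    // After the fix: the stored configuration is forwarded explicitly.
    fn params_after_fix(&self) -> ContextParams {
        ContextParams::default().with_n_ctx(self.n_ctx)
    }
}

fn main() {
    let embedder = Embedder { n_ctx: 4096 };
    // Bug: configured size ignored, default wins.
    assert_eq!(embedder.params_before_fix().n_ctx, 512);
    // Fix: configured size respected.
    assert_eq!(embedder.params_after_fix().n_ctx, 4096);
    println!("ok");
}
```

The design point is that builder-style params structs make this class of bug easy to introduce: `Default::default()` compiles and runs fine, it just quietly drops any per-instance configuration that was never threaded through.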

Verification

  • Verified that embed_batch() now creates a context with the correct size.
  • Ran the existing test suite with cargo test; all tests passed.

Fixes PlatformNetwork/bounty-challenge#200



Development

Successfully merging this pull request may close these issues.

[BUG] context_size Configuration Ignored in embed_batch()
