Bug description
The `truncate` parameter in the ChatUI configuration is not being applied when using the OpenAI `chat_completions` endpoint.
Root Cause
The issue arises because the `chat_completions` endpoint does not call the `buildPrompt` function, which is where the `truncate` parameter is handled. Since the truncation logic lives solely in `buildPrompt`, it is bypassed entirely for `chat_completions` requests, so no truncation is applied to the chat history before it is sent to vllm-openai or OpenAI.
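To illustrate the missing behavior, here is a minimal sketch of what truncation could look like for the chat-message path. The `truncateMessages` helper and the `Message` shape are hypothetical, not ChatUI's actual API; the sketch assumes a simple character budget (real truncation would likely count tokens) and drops the oldest non-system turns until the history fits.

```typescript
// Hypothetical sketch: apply a truncate budget to chat history before it is
// sent to the chat_completions endpoint. Names and the Message shape are
// illustrative; ChatUI's real types differ.
interface Message {
  role: "system" | "user" | "assistant";
  content: string;
}

// Drop the oldest non-system messages until the total content length fits
// the budget, keeping system messages and the most recent turns.
function truncateMessages(messages: Message[], truncate: number): Message[] {
  const system = messages.filter((m) => m.role === "system");
  const rest = messages.filter((m) => m.role !== "system");

  const size = (msgs: Message[]) =>
    msgs.reduce((n, m) => n + m.content.length, 0);

  while (rest.length > 1 && size(system) + size(rest) > truncate) {
    rest.shift(); // remove the oldest turn first
  }
  return [...system, ...rest];
}
```

Something along these lines would need to run on the `chat_completions` request path, since `buildPrompt` (where the existing logic lives) is never invoked there.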
#1654