add glm-5, gpt-5-mini, deepseek-v3.2 models for vivgrid provider #915
rekram1-node merged 1 commit into anomalyco:dev
Conversation
temperature = true
knowledge = "2024-07"
tool_call = true
open_weights = true
Isn't DeepSeek an interleaved reasoning model?
Good question: DeepSeek-V3.2 is a hybrid, interleaved reasoning model rather than a dedicated reasoning-only model.
In most deployments (for example with vLLM), reasoning is not enabled by default and must be explicitly requested at runtime, e.g. by passing chat_template_kwargs.thinking=true in the request body. Without this flag, the model behaves like a standard chat LLM.
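To make that concrete, here is a minimal sketch of what such a request body could look like when calling a vLLM-style OpenAI-compatible endpoint directly. The model name and overall payload shape are illustrative; only the chat_template_kwargs.thinking flag comes from the note above.

```python
import json

# Sketch of a provider-specific request that opts in to DeepSeek-V3.2's
# interleaved reasoning on a vLLM-style deployment. Hypothetical values;
# only chat_template_kwargs.thinking is the documented toggle.
payload = {
    "model": "deepseek-v3.2",
    "messages": [
        {"role": "user", "content": "Explain why the sky is blue"}
    ],
    # Without this, the model responds like a standard chat LLM.
    "chat_template_kwargs": {"thinking": True},
}

body = json.dumps(payload)
print(body)
```

The point is that this toggle is deployment-specific, which is exactly the detail the unified interface below hides.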
In models.dev, the reasoning flag indicates that a model supports reasoning capabilities, not that it always operates in a dedicated reasoning mode. This is consistent with how other providers configure DeepSeek-V3.2.
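Under that convention, a models.dev entry for a hybrid model like DeepSeek-V3.2 would plausibly carry the flag alongside the fields shown in the diff above. This is a sketch of the intent, not the actual diff:

```toml
temperature = true
reasoning = true    # model *supports* reasoning; not always-on
tool_call = true
open_weights = true
```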
In Vivgrid, we abstract these provider-specific details away: developers can enable reasoning using the same client interface they would use for GPT-5.2, without needing to handle provider-specific request parameters.
e.g.
{
  "model": "DeepSeek-V3.2",
  "input": [
    {
      "role": "user",
      "content": "Explain why the sky is blue"
    }
  ],
  "reasoning": {
    "effort": "medium"
  }
}
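The same request could be issued from Python roughly as follows. The auth header and placeholder key are assumptions for illustration, not documented Vivgrid values; the payload mirrors the JSON above.

```python
import json

# Sketch of building the unified Vivgrid-style request from Python.
# Note: no provider-specific chat_template_kwargs; reasoning is enabled
# through the same "reasoning" parameter used for other models.
payload = {
    "model": "DeepSeek-V3.2",
    "input": [
        {"role": "user", "content": "Explain why the sky is blue"}
    ],
    "reasoning": {"effort": "medium"},
}

headers = {
    "Authorization": "Bearer <YOUR_API_KEY>",  # placeholder, not a real key
    "Content-Type": "application/json",
}

print(json.dumps(payload, indent=2))
```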
add 3 models: glm-5, gpt-5-mini, deepseek-v3.2