feat (provider/openai): improve automatic setting removal for reasoning models (#4321)
lgrammel authored Jan 8, 2025
1 parent 03a66e9 commit 4d2f97b
Showing 2 changed files with 60 additions and 5 deletions.
5 changes: 5 additions & 0 deletions .changeset/neat-laws-lay.md
@@ -0,0 +1,5 @@
---
'@ai-sdk/openai': patch
---

feat (provider/openai): improve automatic setting removal for reasoning models
60 changes: 55 additions & 5 deletions packages/openai/src/openai-chat-language-model.ts
@@ -191,12 +191,62 @@ export class OpenAIChatLanguageModel implements LanguageModelV1 {
}),
};

-    // reasoning models have fixed params, remove them if they are set:
+    // remove unsupported settings for reasoning models
+    // see https://platform.openai.com/docs/guides/reasoning#limitations
     if (isReasoningModel(this.modelId)) {

@labenz commented on Jan 10, 2025:

@lgrammel – could we add support for max_tokens here somehow too?

I'm currently seeing:

{"error":"Error: O1 error: {"error":{"message":"Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.","type":"invalid_request_error","param":"max_tokens","code":"unsupported_parameter"}}"}

@lgrammel (Author, Collaborator) commented on Jan 10, 2025:

probably will map to max_completion_tokens automatically
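The mapping suggested in this reply can be sketched as follows. This is an illustrative guess at the idea, not the shipped @ai-sdk/openai code; the function name `mapMaxTokens` and the `Args` shape are assumptions for the example. The o1 API rejects `max_tokens` and accepts `max_completion_tokens` instead, so the provider would carry the value over under the new name:

```typescript
// Hypothetical sketch (not the actual SDK implementation): rename a
// user-supplied max_tokens to max_completion_tokens for reasoning models.
type Args = Record<string, unknown>;

function mapMaxTokens(args: Args): Args {
  if (args.max_tokens == null) {
    return args; // nothing to map
  }
  // drop max_tokens and carry its value over to max_completion_tokens
  const { max_tokens, ...rest } = args;
  return { ...rest, max_completion_tokens: max_tokens };
}

const mapped = mapMaxTokens({ model: 'o1', max_tokens: 512 });
console.log(mapped); // { model: 'o1', max_completion_tokens: 512 }
```

Because the rename happens inside the provider, callers can keep using the SDK's generic `maxTokens` option regardless of which OpenAI model they target.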

-      baseArgs.temperature = undefined;
-      baseArgs.top_p = undefined;
-      baseArgs.frequency_penalty = undefined;
-      baseArgs.presence_penalty = undefined;
+      if (baseArgs.temperature != null) {
+        baseArgs.temperature = undefined;
+        warnings.push({
+          type: 'unsupported-setting',
+          setting: 'temperature',
+          details: 'temperature is not supported for reasoning models',
+        });
+      }
+      if (baseArgs.top_p != null) {
+        baseArgs.top_p = undefined;
+        warnings.push({
+          type: 'unsupported-setting',
+          setting: 'topP',
+          details: 'topP is not supported for reasoning models',
+        });
+      }
+      if (baseArgs.frequency_penalty != null) {
+        baseArgs.frequency_penalty = undefined;
+        warnings.push({
+          type: 'unsupported-setting',
+          setting: 'frequencyPenalty',
+          details: 'frequencyPenalty is not supported for reasoning models',
+        });
+      }
+      if (baseArgs.presence_penalty != null) {
+        baseArgs.presence_penalty = undefined;
+        warnings.push({
+          type: 'unsupported-setting',
+          setting: 'presencePenalty',
+          details: 'presencePenalty is not supported for reasoning models',
+        });
+      }
+      if (baseArgs.logit_bias != null) {
+        baseArgs.logit_bias = undefined;
+        warnings.push({
+          type: 'other',
+          message: 'logitBias is not supported for reasoning models',
+        });
+      }
+      if (baseArgs.logprobs != null) {
+        baseArgs.logprobs = undefined;
+        warnings.push({
+          type: 'other',
+          message: 'logprobs is not supported for reasoning models',
+        });
+      }
+      if (baseArgs.top_logprobs != null) {
+        baseArgs.top_logprobs = undefined;
+        warnings.push({
+          type: 'other',
+          message: 'topLogprobs is not supported for reasoning models',
+        });
+      }
     }

switch (type) {
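The warning-collection pattern from the diff can be condensed into a standalone sketch. Everything below is illustrative, not the actual SDK internals: `stripReasoningSettings`, the loose `Args` shape, and the `o1` prefix check are assumptions for the example (the real code guards with `isReasoningModel(this.modelId)` and pushes into a method-scoped `warnings` array).

```typescript
// Illustrative sketch of the pattern in this commit (not the shipped
// @ai-sdk/openai code): instead of silently zeroing unsupported settings,
// each setting the caller actually set is cleared AND a warning is
// recorded, so users learn why their setting had no effect.
type Warning =
  | { type: 'unsupported-setting'; setting: string; details: string }
  | { type: 'other'; message: string };

type Args = Record<string, unknown>;

// Assumption for the sketch; the real guard is isReasoningModel(modelId).
const isReasoningModel = (modelId: string) => modelId.startsWith('o1');

function stripReasoningSettings(modelId: string, args: Args): Warning[] {
  const warnings: Warning[] = [];
  if (!isReasoningModel(modelId)) return warnings;

  // settings that map to a typed 'unsupported-setting' warning
  const typed: Array<[string, string]> = [
    ['temperature', 'temperature'],
    ['top_p', 'topP'],
    ['frequency_penalty', 'frequencyPenalty'],
    ['presence_penalty', 'presencePenalty'],
  ];
  for (const [key, setting] of typed) {
    if (args[key] != null) {
      args[key] = undefined;
      warnings.push({
        type: 'unsupported-setting',
        setting,
        details: `${setting} is not supported for reasoning models`,
      });
    }
  }

  // settings that only get a generic 'other' warning
  const generic: Array<[string, string]> = [
    ['logit_bias', 'logitBias'],
    ['logprobs', 'logprobs'],
    ['top_logprobs', 'topLogprobs'],
  ];
  for (const [key, name] of generic) {
    if (args[key] != null) {
      args[key] = undefined;
      warnings.push({
        type: 'other',
        message: `${name} is not supported for reasoning models`,
      });
    }
  }

  return warnings;
}

const args: Args = { temperature: 0.7, logprobs: true, max_tokens: 100 };
const warnings = stripReasoningSettings('o1-mini', args);
console.log(warnings.length); // 2: only temperature and logprobs were set
console.log(args.temperature); // undefined
```

The `!= null` guards are the behavioral change this commit makes: a warning is emitted only for settings the caller actually provided, rather than unconditionally clearing everything.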

0 comments on commit 4d2f97b
