Conversation
Pull request overview
Updates the repo’s OpenAI .NET SDK dependency to 2.9.0 and adapts the OpenAI integration layer (Responses + Realtime + image generation) and tests to the updated SDK surface area.
Changes:
- Bump `OpenAI` package version to 2.9.0 and update affected call patterns (notably Responses client model selection and image input handling).
- Update Responses/Realtime conversion and tooling glue code to match the new SDK types/properties.
- Adjust tests to new SDK behaviors and payload shapes (e.g., image URL media type inference, instructions representation).
Reviewed changes
Copilot reviewed 12 out of 12 changed files in this pull request and generated 5 comments.
| File | Description |
|---|---|
| test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIResponseClientTests.cs | Updates Responses client creation/adapter usage and expected media type for image URL content. |
| test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIResponseClientIntegrationTests.cs | Updates integration test client creation to new Responses client + model wiring. |
| test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIConversionTests.cs | Updates conversion tests for new tool types/properties and instructions representation changes. |
| test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIChatClientTests.cs | Updates expected request JSON for model/token fields in chat tests. |
| src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIResponsesChatClient.cs | Refactors to accommodate SDK changes (model handling, image URI handling, reasoning effort enum changes). |
| src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIRealtimeConversationClient.cs | Renames/updates realtime function tool mapping to new SDK type/property names. |
| src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIImageGenerator.cs | Adds media-type-to-output-format mapping for image edit options and minor boolean pattern tweaks. |
| src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIClientExtensions.cs | Updates ResponsesClient AsIChatClient signature to accept a default model id. |
| src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIChatClient.cs | Updates reasoning effort mapping and notes a 2.9.0 regression around model override patching. |
| src/Libraries/Microsoft.Extensions.AI.OpenAI/MicrosoftExtensionsAIResponsesExtensions.cs | Adjusts Responses result instructions mapping and updates doc cref signatures. |
| src/Libraries/Microsoft.Extensions.AI.OpenAI/MicrosoftExtensionsAIRealtimeExtensions.cs | Renames realtime extension method to new tool type and updates docs accordingly. |
| eng/packages/General.props | Bumps OpenAI package version from 2.8.0 to 2.9.0. |
Comments suppressed due to low confidence (3)
src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIClientExtensions.cs:127
`defaultModelId` is optional, but when omitted it can cause requests to be sent without a model (depending on `ChatOptions.ModelId`), which is likely to fail at runtime and is a behavior change from the previous implementation, which used the response client's model. If possible, consider requiring `defaultModelId` (or throwing when missing) to avoid silent misconfiguration.
/// <summary>Gets an <see cref="IChatClient"/> for use with this <see cref="ResponsesClient"/>.</summary>
/// <param name="responseClient">The client.</param>
/// <param name="defaultModelId">The default model ID to use for the chat client.</param>
/// <returns>An <see cref="IChatClient"/> that can be used to converse via the <see cref="ResponsesClient"/>.</returns>
/// <exception cref="ArgumentNullException"><paramref name="responseClient"/> is <see langword="null"/>.</exception>
[Experimental(DiagnosticIds.Experiments.AIOpenAIResponses)]
public static IChatClient AsIChatClient(this ResponsesClient responseClient, string? defaultModelId = null) =>
new OpenAIResponsesChatClient(responseClient, defaultModelId);
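One way to implement the suggested guard is a minimal sketch like the following; it is hypothetical (the actual fix may instead fall back to `ChatOptions.ModelId` or validate later), and it assumes `OpenAIResponsesChatClient` and `DiagnosticIds` from the surrounding file:

```csharp
// Hypothetical stricter variant: require a non-empty model id up front
// so misconfiguration fails at construction time rather than at request time.
[Experimental(DiagnosticIds.Experiments.AIOpenAIResponses)]
public static IChatClient AsIChatClient(this ResponsesClient responseClient, string defaultModelId)
{
    ArgumentNullException.ThrowIfNull(responseClient);

    if (string.IsNullOrWhiteSpace(defaultModelId))
    {
        throw new ArgumentException("A default model ID must be provided.", nameof(defaultModelId));
    }

    return new OpenAIResponsesChatClient(responseClient, defaultModelId);
}
```

This trades flexibility for an early, loud failure; a softer alternative would keep the parameter optional but throw from the request path when neither `defaultModelId` nor `ChatOptions.ModelId` is set.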
test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIChatClientTests.cs:1684
- These tests are named as if `ChatOptions.ModelId` overrides the client model, but the expected request JSON now uses the client model (`gpt-4o-mini`) while `ChatOptions.ModelId` is set to `gpt-4o`. Either update the test name/assertions to reflect the new behavior, or (preferably) keep the test asserting the override behavior once the underlying issue is fixed.
This issue also appears on line 1692 of the same file.
[Fact]
public async Task ChatOptions_ModelId_OverridesClientModel_NonStreaming()
{
const string Input = """
{
"temperature":0.5,
"messages":[{"role":"user","content":"hello"}],
"model":"gpt-4o-mini",
"max_completion_tokens":10
}
""";
const string Output = """
{
"id": "chatcmpl-ADx3PvAnCwJg0woha4pYsBTi3ZpOI",
"object": "chat.completion",
"created": 1727888631,
"model": "gpt-4o-2024-08-06",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "Hello! How can I assist you today?",
"refusal": null
},
"logprobs": null,
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 8,
"completion_tokens": 9,
"total_tokens": 17
}
}
""";
using VerbatimHttpHandler handler = new(Input, Output);
using HttpClient httpClient = new(handler);
using IChatClient client = CreateChatClient(httpClient, "gpt-4o-mini");
var response = await client.GetResponseAsync("hello", new()
{
MaxOutputTokens = 10,
Temperature = 0.5f,
ModelId = "gpt-4o",
});
test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIChatClientTests.cs:1731
- Same issue as the non-streaming variant: the test name indicates `ChatOptions.ModelId` overrides the client model, but the expected request JSON uses `gpt-4o-mini` while `ChatOptions.ModelId` is set to `gpt-4o`. Please align the test name/assertions with the intended contract.
[Fact]
public async Task ChatOptions_ModelId_OverridesClientModel_Streaming()
{
const string Input = """
{
"temperature":0.5,
"messages":[{"role":"user","content":"hello"}],
"model":"gpt-4o-mini",
"max_completion_tokens":20,
"stream":true,
"stream_options":{"include_usage":true}
}
""";
const string Output = """
data: {"id":"chatcmpl-ADxFKtX6xIwdWRN42QvBj2u1RZpCK","object":"chat.completion.chunk","created":1727889370,"model":"gpt-4o-2024-08-06","system_fingerprint":"fp_f85bea6784","choices":[{"index":0,"delta":{"role":"assistant","content":"","refusal":null},"logprobs":null,"finish_reason":null}],"usage":null}
data: {"id":"chatcmpl-ADxFKtX6xIwdWRN42QvBj2u1RZpCK","object":"chat.completion.chunk","created":1727889370,"model":"gpt-4o-2024-08-06","system_fingerprint":"fp_f85bea6784","choices":[{"index":0,"delta":{"content":"Hello"},"logprobs":null,"finish_reason":null}],"usage":null}
data: {"id":"chatcmpl-ADxFKtX6xIwdWRN42QvBj2u1RZpCK","object":"chat.completion.chunk","created":1727889370,"model":"gpt-4o-2024-08-06","system_fingerprint":"fp_f85bea6784","choices":[{"index":0,"delta":{"content":"!"},"logprobs":null,"finish_reason":null}],"usage":null}
data: {"id":"chatcmpl-ADxFKtX6xIwdWRN42QvBj2u1RZpCK","object":"chat.completion.chunk","created":1727889370,"model":"gpt-4o-2024-08-06","system_fingerprint":"fp_f85bea6784","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}],"usage":null}
data: {"id":"chatcmpl-ADxFKtX6xIwdWRN42QvBj2u1RZpCK","object":"chat.completion.chunk","created":1727889370,"model":"gpt-4o-2024-08-06","system_fingerprint":"fp_f85bea6784","choices":[],"usage":{"prompt_tokens":8,"completion_tokens":9,"total_tokens":17}}
data: [DONE]
""";
using VerbatimHttpHandler handler = new(Input, Output);
using HttpClient httpClient = new(handler);
using IChatClient client = CreateChatClient(httpClient, "gpt-4o-mini");
List<ChatResponseUpdate> updates = [];
await foreach (var update in client.GetStreamingResponseAsync("hello", new()
{
MaxOutputTokens = 20,
Temperature = 0.5f,
ModelId = "gpt-4o",
}))
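For reference, if the override contract is restored, the expected request would presumably carry the options-specified model rather than the client model. A hypothetical sketch of what the streaming test's expected payload would assert (not the actual fix, and subject to whatever the SDK regression resolution looks like):

```csharp
// Hypothetical expected request once ChatOptions.ModelId (gpt-4o)
// again overrides the client model (gpt-4o-mini):
const string Input = """
    {
        "temperature":0.5,
        "messages":[{"role":"user","content":"hello"}],
        "model":"gpt-4o",
        "max_completion_tokens":20,
        "stream":true,
        "stream_options":{"include_usage":true}
    }
    """;
```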
Resolved review comments on:
- src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIResponsesChatClient.cs (×3)
- src/Libraries/Microsoft.Extensions.AI.OpenAI/MicrosoftExtensionsAIRealtimeExtensions.cs
- test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIConversionTests.cs
@stephentoub looks like we need to get the new package version added to our infra's internal NuGet cache. I used to ask the infra first responders to help with that, but I don't know if we're still doing that.
I see you already did 😄
Blocked on openai/openai-dotnet#991 |