Description
Expected Behavior
Essentially, title.
Many of the modern UI frameworks for LLMs standardize on what is called an "OpenAI-compliant API" response format: you can use any LLM and server-side framework combination you want, as long as the server adheres to the OpenAI response format.
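For context, one streamed server-sent event in that format looks roughly like the following (field values here are purely illustrative):

```json
data: {"id":"chatcmpl-abc123","object":"chat.completion.chunk","created":1700000000,"model":"gpt-4o","choices":[{"index":0,"delta":{"content":"Hello"},"logprobs":null,"finish_reason":null}]}
```

Clients built against OpenAI parse exactly this shape, which is why the field names and nesting matter.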
Current Behavior
The ChatResponse class is not directly compatible with the OpenAI format. That is understandable, since Spring AI aims to be a generic framework that abstracts over many different providers with a unified API.
The cost of writing a small "wrapper record" is negligible and it works well enough, but is there a function or method somewhere in the framework that can make the responses OpenAI-compliant?
Context
Essentially, many clients standardize on OpenAI-compliant formats, which means some of Spring AI's output needs to be adapted. It could make sense to offer this out of the box.
Example of a wrapper record for streaming "OpenAI chunks":
import java.util.List;

import com.fasterxml.jackson.annotation.JsonProperty;

public class OpenAIStreamingResponse {

    // Mirrors one SSE chunk of an OpenAI chat completions streaming response.
    public record ChatCompletionChunk(
            String id,
            String object,   // "chat.completion.chunk" for streaming responses
            long created,    // Unix timestamp in seconds
            String model,
            @JsonProperty("system_fingerprint") String systemFingerprint,
            List<ChunkChoice> choices) {}

    public record ChunkChoice(
            int index,
            Delta delta,
            Object logprobs,
            @JsonProperty("finish_reason") String finishReason) {}

    // Incremental message content; role is typically only set on the first chunk.
    public record Delta(String role, String content) {}
}
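For illustration, a minimal adapter along these lines could populate the chunk fields from each streamed Spring AI generation. This is only a sketch: the Spring AI types are left out so the example is self-contained, the `content` and `finishReason` parameters stand in for values you would pull from each ChatResponse in the streaming Flux, and the records are repeated here without the Jackson annotations.

```java
import java.util.List;
import java.util.UUID;

// Sketch: adapt one streamed generation into an OpenAI-style chunk.
// The records duplicate the wrapper above (minus @JsonProperty) so this
// example compiles on its own.
public class ChunkAdapter {

    public record ChatCompletionChunk(
            String id, String object, long created, String model,
            String systemFingerprint, List<ChunkChoice> choices) {}

    public record ChunkChoice(int index, Delta delta, Object logprobs, String finishReason) {}

    public record Delta(String role, String content) {}

    // 'content' and 'finishReason' would come from the Spring AI ChatResponse;
    // here they are plain parameters. Simplification: the role is emitted on
    // every non-final chunk rather than only on the first one.
    public static ChatCompletionChunk toChunk(String model, String content, String finishReason) {
        Delta delta = new Delta(finishReason == null ? "assistant" : null, content);
        return new ChatCompletionChunk(
                "chatcmpl-" + UUID.randomUUID(),     // OpenAI-style id prefix
                "chat.completion.chunk",             // fixed value for streaming
                System.currentTimeMillis() / 1000L,  // Unix timestamp in seconds
                model,
                null,                                // system_fingerprint may be absent
                List.of(new ChunkChoice(0, delta, null, finishReason)));
    }

    public static void main(String[] args) {
        ChatCompletionChunk chunk = toChunk("my-model", "Hello", null);
        System.out.println(chunk.object() + " " + chunk.choices().get(0).delta().content());
    }
}
```

Serialized with Jackson (using the annotated records above), each chunk can then be written out as a `data:` SSE line in the format clients expect.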