Introduce a new ToolExecutionEligibilityChecker interface that provides a more flexible way to determine when tool execution should be performed based on model responses. This abstraction replaces the hard-coded checks previously scattered across the codebase; a rough sketch of the interface shape follows the list below.
- Adds a new ToolExecutionEligibilityChecker interface in spring-ai-core
- Integrates the checker into OpenAiChatModel with appropriate defaults
- Updates OpenAiChatAutoConfiguration to support the new interface
- Provides a default implementation that maintains backward compatibility
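As a point of reference, this is a minimal sketch of the checker's shape, inferred from the description above and from the refactor notes below (the checker was a Function<ChatResponse, Boolean>); the default method name is illustrative, not the exact API.

```java
import java.util.function.Function;

import org.springframework.ai.chat.model.ChatResponse;

// Sketch of the original abstraction: a Function<ChatResponse, Boolean> that
// decides, from the model response alone, whether tools should be executed.
public interface ToolExecutionEligibilityChecker extends Function<ChatResponse, Boolean> {

	// Illustrative convenience method; the real interface may differ.
	default boolean isToolExecutionRequired(ChatResponse chatResponse) {
		return apply(chatResponse);
	}
}
```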
Signed-off-by: Christian Tzolov <[email protected]>
refactor: Replace ToolExecutionEligibilityChecker with ToolExecutionEligibilityPredicate
- Replacing ToolExecutionEligibilityChecker with ToolExecutionEligibilityPredicate
- Changing from Function<ChatResponse, Boolean> to BiPredicate<ChatOptions, ChatResponse>
- Adding a DefaultToolExecutionEligibilityPredicate implementation
- Updating AnthropicChatModel and OpenAiChatModel to use the new predicate
- Updating auto-configurations to inject the new predicate
- Adding comprehensive tests for the new predicate implementation
The new approach provides a cleaner and more consistent way to determine when tool execution should be performed, based on both the prompt options and the chat response, as sketched below.
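A minimal sketch of what the new contract and its default implementation might look like, based on the bullets above. The helpers ToolCallingChatOptions.isInternalToolExecutionEnabled(..) and ChatResponse.hasToolCalls() reflect Spring AI's tool-calling API as I understand it and should be treated as assumptions.

```java
import java.util.function.BiPredicate;

import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.ChatOptions;
import org.springframework.ai.model.tool.ToolCallingChatOptions;

// The predicate sees both the prompt options and the model response.
public interface ToolExecutionEligibilityPredicate extends BiPredicate<ChatOptions, ChatResponse> {

	// Convenience alias over BiPredicate#test (illustrative naming).
	default boolean isToolExecutionRequired(ChatOptions promptOptions, ChatResponse chatResponse) {
		return test(promptOptions, chatResponse);
	}
}

// Default behavior preserving the previous semantics: run tools only when the
// prompt enables internal tool execution and the response contains tool calls.
class DefaultToolExecutionEligibilityPredicate implements ToolExecutionEligibilityPredicate {

	@Override
	public boolean test(ChatOptions promptOptions, ChatResponse chatResponse) {
		return ToolCallingChatOptions.isInternalToolExecutionEnabled(promptOptions)
				&& chatResponse != null && chatResponse.hasToolCalls();
	}
}
```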
Add Bedrock Converse support
Add Mistral support
Add Ollama and Vertex Gemini support
Add ToolExecutionEligibilityPredicate docs
Signed-off-by: Christian Tzolov <[email protected]>
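With auto-configuration support in place for these models, an application could presumably override the default by contributing its own predicate bean, assuming the auto-configurations only fall back to DefaultToolExecutionEligibilityPredicate when no user bean exists (a typical @ConditionalOnMissingBean-style arrangement, not confirmed here). The package locations and policy below are illustrative assumptions.

```java
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.ChatOptions;
import org.springframework.ai.model.tool.ToolExecutionEligibilityPredicate;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical user configuration overriding the default eligibility policy.
@Configuration
public class CustomToolEligibilityConfiguration {

	@Bean
	ToolExecutionEligibilityPredicate toolExecutionEligibilityPredicate() {
		// Run tools whenever the model asked for them, ignoring the prompt's
		// internal-execution flag (hasToolCalls() on ChatResponse is assumed).
		return (ChatOptions options, ChatResponse response) ->
				response != null && response.hasToolCalls();
	}
}
```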
Files changed:
- auto-configurations/models/spring-ai-autoconfigure-model-anthropic/src/main/java/org/springframework/ai/model/anthropic/autoconfigure/AnthropicChatAutoConfiguration.java
- auto-configurations/models/spring-ai-autoconfigure-model-azure-openai/src/main/java/org/springframework/ai/model/azure/openai/autoconfigure/AzureOpenAiChatAutoConfiguration.java
- auto-configurations/models/spring-ai-autoconfigure-model-bedrock-ai/src/main/java/org/springframework/ai/model/bedrock/converse/autoconfigure/BedrockConverseProxyChatAutoConfiguration.java
- auto-configurations/models/spring-ai-autoconfigure-model-mistral-ai/src/main/java/org/springframework/ai/model/mistralai/autoconfigure/MistralAiChatAutoConfiguration.java
- auto-configurations/models/spring-ai-autoconfigure-model-ollama/src/main/java/org/springframework/ai/model/ollama/autoconfigure/OllamaChatAutoConfiguration.java
- auto-configurations/models/spring-ai-autoconfigure-model-openai/src/main/java/org/springframework/ai/model/openai/autoconfigure/OpenAiChatAutoConfiguration.java
- auto-configurations/models/spring-ai-autoconfigure-model-vertex-ai/src/main/java/org/springframework/ai/model/vertexai/autoconfigure/gemini/VertexAiGeminiChatAutoConfiguration.java
- auto-configurations/models/spring-ai-autoconfigure-model-vertex-ai/src/test/java/org/springframework/ai/model/vertexai/autoconfigure/gemini/tool/FunctionCallWithPromptFunctionIT.java
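On the framework side, the auto-configurations listed above most likely resolve the predicate through an ObjectProvider and fall back to the default when the application supplies nothing. The small helper below only illustrates that fallback pattern; the class name and package locations are assumptions.

```java
import org.springframework.ai.model.tool.DefaultToolExecutionEligibilityPredicate;
import org.springframework.ai.model.tool.ToolExecutionEligibilityPredicate;
import org.springframework.beans.factory.ObjectProvider;

// Illustrative resolution helper: use the application-provided predicate if
// there is exactly one, otherwise fall back to the backward-compatible default.
final class ToolEligibilityWiring {

	private ToolEligibilityWiring() {
	}

	static ToolExecutionEligibilityPredicate resolve(
			ObjectProvider<ToolExecutionEligibilityPredicate> userPredicate) {
		return userPredicate.getIfUnique(DefaultToolExecutionEligibilityPredicate::new);
	}
}
```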
FunctionCallWithPromptFunctionIT: one-line change (+1 −1) in the hunk starting at line 50.