Description
When using a function tool together with a response model, passing the model's schema via the `format` argument prevents the tool from ever being invoked. The same code works when `format` is omitted, but breaks when a schema generated from a Pydantic model with `model_json_schema()` is passed.
Code to Reproduce
```python
from ollama import ChatResponse, chat
from pydantic import BaseModel, Field
import json


class AddTwoNumbersOutput(BaseModel):
    """Output schema for the add_two_numbers function."""

    a: int = Field(..., description="The first number")
    b: int = Field(..., description="The second number")
    result: int = Field(..., description="The result of the addition")


def add_two_numbers(a: int, b: int) -> int:
    """
    Add two numbers.

    Args:
        a (int): The first number
        b (int): The second number

    Returns:
        int: The sum of the two numbers
    """
    # The cast is necessary because returned tool-call arguments don't always
    # conform exactly to the schema. E.g. it prevents "what is 30 + 12" from
    # producing '3012' (string concatenation) instead of 42.
    return int(a) + int(b)


messages = [{'role': 'user', 'content': 'What is three plus one?'}]
print('Prompt:', messages[0]['content'])

available_functions = {
    'add_two_numbers': add_two_numbers,
}

response: ChatResponse = chat(
    'llama3.1',
    messages=messages,
    tools=[add_two_numbers],
    format=AddTwoNumbersOutput.model_json_schema(),
)

if response.message.tool_calls:
    tool = response.message.tool_calls[0]
    # Ensure the function is available, and then call it
    if function_to_call := available_functions.get(tool.function.name):
        print('Calling function:', tool.function.name)
        print('Arguments:', tool.function.arguments)
        output = function_to_call(**tool.function.arguments)
        print('Function output:', output)
    else:
        print('Function', tool.function.name, 'not found')

# Only needed to chat with the model using the tool call results
if response.message.tool_calls:
    # Add the function response to messages for the model to use
    messages.append(response.message)
    messages.append({'role': 'tool', 'content': str(output), 'tool_name': tool.function.name})

    # Get final response from model with function outputs
    final_response = chat('llama3.1', messages=messages)
    print('Final response:', json.dumps(final_response.message.content, indent=2, ensure_ascii=False))
else:
    print('No tool calls returned from model')
```
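For reference, the value passed to `format` in the snippet above is a plain JSON Schema dict. A quick standalone check (no Ollama server needed) of what `model_json_schema()` produces:

```python
import json

from pydantic import BaseModel, Field


class AddTwoNumbersOutput(BaseModel):
    """Output schema for the add_two_numbers function."""

    a: int = Field(..., description="The first number")
    b: int = Field(..., description="The second number")
    result: int = Field(..., description="The result of the addition")


schema = AddTwoNumbersOutput.model_json_schema()
print(json.dumps(schema, indent=2))

# The schema asks the model to emit a JSON object with exactly these keys:
assert schema["type"] == "object"
assert sorted(schema["required"]) == ["a", "b", "result"]
```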
Observed Behavior
- With `format=...`: no tool call is made; `response.message.tool_calls` is empty.
- Without `format=...`: the function `add_two_numbers` is correctly triggered and returns the result.
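For context, when `format` works as intended (without tools involved), the final `response.message.content` is a JSON string matching the schema, which is then validated back into the Pydantic model. A sketch with a hypothetical model output in place of a real server response:

```python
from pydantic import BaseModel, Field


class AddTwoNumbersOutput(BaseModel):
    a: int = Field(..., description="The first number")
    b: int = Field(..., description="The second number")
    result: int = Field(..., description="The result of the addition")


# Hypothetical response.message.content, as produced when format=... is honored
content = '{"a": 3, "b": 1, "result": 4}'
parsed = AddTwoNumbersOutput.model_validate_json(content)
print(parsed.result)  # 4
```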
Expected Behavior
Providing a schema via `format=AddTwoNumbersOutput.model_json_schema()` should not prevent tool calls from being triggered.
My Environment
- OS: macOS 15.6 (M1)
- Python: 3.13.5
- ollama-python: 0.5.1
- Ollama: 0.10.1
- Model: `llama3.1`