
404 Error During Agent Handoff in Multi-Agent System #206

Open
BowenBryanWang opened this issue Mar 17, 2025 · 5 comments
Labels
bug Something isn't working

Comments


BowenBryanWang commented Mar 17, 2025

Please read this first

  • Have you read the docs? (Agents SDK docs) Yes
  • Have you searched for related issues? Others may have faced similar issues.

Describe the bug

During execution of a multi-agent task, our system encountered a critical 404 error when attempting to process a function call between agents. The error occurred specifically when the General Agent tried to hand off to the Search Agent to find NBA finals results.

Debug information

  • Agents SDK version: v0.0.4
  • Python version: Python 3.9

Log


2025-03-17 19:22:38 - openai_agents_integration.py:724 - INFO - [{'content': 'can you help me search for the NBA finals result of last 5 years and then write the results in a new markdown file using MarkText, save it as "NBA.md"', 'role': 'user'}, {'id': 'fc_67d876754b2c8191bec6637d6c64a8d70ca151ec7ca34d36', 'arguments': '{}', 'call_id': 'call_Do6VuH7tpEBxBSoPuvACwDvo', 'name': 'transfer_to_search_agent', 'type': 'function_call', 'status': 'completed'}, {'id': 'fc_67d87675a94c81918773f9919a2b10b30ca151ec7ca34d36', 'arguments': '{}', 'call_id': 'call_S2ewJBvZ8bGaP5XcsQBiSLSZ', 'name': 'transfer_to_computer_agent', 'type': 'function_call', 'status': 'completed'}, {'call_id': 'call_S2ewJBvZ8bGaP5XcsQBiSLSZ', 'output': 'Multiple handoffs detected, ignoring this one.', 'type': 'function_call_output'}, {'call_id': 'call_Do6VuH7tpEBxBSoPuvACwDvo', 'output': "{'assistant': 'Search Agent'}", 'type': 'function_call_output'}]
2025-03-17 19:22:38 - openai_agents_integration.py:725 - INFO - Response(id='resp_67d8767667948191b587805df3631f820ca151ec7ca34d36', created_at=1742239350.0, error=None, incomplete_details=None, instructions="You are a searching agent connected to the Internet, you can help do information retrieval to gather useful information for the user's instruction. However, you cannot do any GUI actions on the computer unless you handoff the task to the Computer Agent.", metadata={}, model='gpt-4o-2024-08-06', object='response', output=[ResponseFunctionWebSearch(id='ws_67d87676eeac81918b0752e6fa32bc510ca151ec7ca34d36', status='completed', type='web_search_call'), ResponseOutputMessage(id='msg_67d8767987188191b9b16b6584c8f2280ca151ec7ca34d36', content=[ResponseOutputText(annotations=[], text='Here are the NBA Finals results from the past five years:\n\n| Year | Champion               | Series Result | Runner-Up         | Finals MVP             |\n|------|------------------------|---------------|-------------------|------------------------|\n| 2024 | Boston Celtics         | 4–1           | Dallas Mavericks  | Jaylen Brown           |\n| 2023 | Denver Nuggets         | 4–1           | Miami Heat        | Nikola Jokić           |\n| 2022 | Golden State Warriors  | 4–2           | Boston Celtics    | Stephen Curry          |\n| 2021 | Milwaukee Bucks        | 4–2           | Phoenix Suns      | Giannis Antetokounmpo  |\n| 2020 | Los Angeles Lakers     | 4–2           | Miami Heat        | LeBron James           |\n\nThese results highlight the competitive nature of the NBA over the past five seasons, with different teams achieving championship success each year. 
', type='output_text')], role='assistant', status='completed', type='message'), ResponseFunctionToolCall(id='fc_67d8767bb470819190efb9fb355937970ca151ec7ca34d36', arguments='{}', call_id='call_Eb74hcO7M492eoQdIXRxxkyH', name='transfer_to_computer_agent', type='function_call', status='completed')], parallel_tool_calls=True, temperature=1.0, tool_choice='auto', tools=[FunctionTool(name='transfer_to_computer_agent', parameters={'additionalProperties': False, 'type': 'object', 'properties': {}, 'required': []}, strict=True, type='function', description='Handoff to the Computer Agent agent to handle the request. A real user computer environment to do GUI actions.'), WebSearchTool(type='web_search_preview', search_context_size='medium', user_location=UserLocation(type='approximate', city='New York', country=None, region=None, timezone=None))], top_p=1.0, max_output_tokens=None, previous_response_id=None, reasoning=Reasoning(effort=None, generate_summary=None), status='completed', text=ResponseTextConfig(format=ResponseFormatText(type='text')), truncation='disabled', usage=ResponseUsage(input_tokens=1265, output_tokens=228, output_tokens_details=OutputTokensDetails(reasoning_tokens=0), total_tokens=1493, input_tokens_details={'cached_tokens': 0}), user=None, store=False)

2025-03-17 19:22:36 - openai_agent.py:533 - ERROR - Error running OpenAI Agent: Error code: 404 - {'error': {'message': "Item with id 'fc_67d876754b2c8191bec6637d6c64a8d70ca151ec7ca34d36' not found.", 'type': 'invalid_request_error', 'param': 'input', 'code': None}}
Traceback (most recent call last):

  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/threading.py", line 937, in _bootstrap
    self._bootstrap_inner()
    │    └ <function Thread._bootstrap_inner at 0x77ae73f8fdc0>
    └ <DaemonThread(Thread-277, started daemon 131585582040640)>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/threading.py", line 980, in _bootstrap_inner
    self.run()
    │    └ <function Thread.run at 0x77ae73f8faf0>
    └ <DaemonThread(Thread-277, started daemon 131585582040640)>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/threading.py", line 917, in run
    self._target(*self._args, **self._kwargs)
    │    │        │    │        │    └ {}
    │    │        │    │        └ <DaemonThread(Thread-277, started daemon 131585582040640)>
    │    │        │    └ (<socketio.server.Server object at 0x77ae71e04610>, 'dTs_zpIXxyXjfPalAAAd', '_hrfe3wpcicsAUcVAAAc', ['message', {'user_id': '...
    │    │        └ <DaemonThread(Thread-277, started daemon 131585582040640)>
    │    └ <bound method Server._handle_event_internal of <socketio.server.Server object at 0x77ae71e04610>>
    └ <DaemonThread(Thread-277, started daemon 131585582040640)>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/socketio/server.py", line 586, in _handle_event_internal
    r = server._trigger_event(data[0], namespace, sid, *data[1:])
        │      │              │        │          │     └ ['message', {'user_id': '62492e5d-2ed3-4469-91b1-1363cdbefeff', 'chat_id': 'cc2a679d-8050-4530-b5f9-3038f97f5719', 'user_inte...
        │      │              │        │          └ 'dTs_zpIXxyXjfPalAAAd'
        │      │              │        └ '/'
        │      │              └ ['message', {'user_id': '62492e5d-2ed3-4469-91b1-1363cdbefeff', 'chat_id': 'cc2a679d-8050-4530-b5f9-3038f97f5719', 'user_inte...
        │      └ <function Server._trigger_event at 0x77ae71e78820>
        └ <socketio.server.Server object at 0x77ae71e04610>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/socketio/server.py", line 611, in _trigger_event
    return handler(*args)
           │        └ ('dTs_zpIXxyXjfPalAAAd', {'user_id': '62492e5d-2ed3-4469-91b1-1363cdbefeff', 'chat_id': 'cc2a679d-8050-4530-b5f9-3038f97f5719...
           └ <function handle_message at 0x77ad466cb5e0>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/flask_socketio/__init__.py", line 282, in _handler
    return self._handle_event(handler, message, namespace, sid,
           │    │             │        │        │          └ 'dTs_zpIXxyXjfPalAAAd'
           │    │             │        │        └ '/'
           │    │             │        └ 'message'
           │    │             └ <function handle_message at 0x77ad466c2af0>
           │    └ <function SocketIO._handle_event at 0x77ae71df3a60>
           └ <flask_socketio.SocketIO object at 0x77ae71e04220>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/flask_socketio/__init__.py", line 827, in _handle_event
    ret = handler(*args)
          │        └ ({'user_id': '62492e5d-2ed3-4469-91b1-1363cdbefeff', 'chat_id': 'cc2a679d-8050-4530-b5f9-3038f97f5719', 'user_intent': 'can y...
          └ <function handle_message at 0x77ad466c2af0>

  File "/home/ubuntu/workspace/VLMAgentArena/backend/api/conversation.py", line 35, in handle_message
    return loop.run_until_complete(async_handle_message(json))
           │    │                  │                    └ {'user_id': '62492e5d-2ed3-4469-91b1-1363cdbefeff', 'chat_id': 'cc2a679d-8050-4530-b5f9-3038f97f5719', 'user_intent': 'can yo...
           │    │                  └ <function async_handle_message at 0x77ad5749daf0>
           │    └ <function BaseEventLoop.run_until_complete at 0x77ae728301f0>
           └ <_UnixSelectorEventLoop running=True closed=False debug=False>

  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/asyncio/base_events.py", line 634, in run_until_complete
    self.run_forever()
    │    └ <function BaseEventLoop.run_forever at 0x77ae72830160>
    └ <_UnixSelectorEventLoop running=True closed=False debug=False>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/asyncio/base_events.py", line 601, in run_forever
    self._run_once()
    │    └ <function BaseEventLoop._run_once at 0x77ae72832ca0>
    └ <_UnixSelectorEventLoop running=True closed=False debug=False>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/asyncio/base_events.py", line 1905, in _run_once
    handle._run()
    │      └ <function Handle._run at 0x77ae7286e310>
    └ <Handle <TaskStepMethWrapper object at 0x77ad40ecd790>()>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
    │    │            │    │           │    └ <member '_args' of 'Handle' objects>
    │    │            │    │           └ <Handle <TaskStepMethWrapper object at 0x77ad40ecd790>()>
    │    │            │    └ <member '_callback' of 'Handle' objects>
    │    │            └ <Handle <TaskStepMethWrapper object at 0x77ad40ecd790>()>
    │    └ <member '_context' of 'Handle' objects>
    └ <Handle <TaskStepMethWrapper object at 0x77ad40ecd790>()>

  File "/home/ubuntu/workspace/VLMAgentArena/backend/api/conversation.py", line 293, in execute_agent
    return await agent.run(task_instruction=task_instruction)
                 │     │                    └ 'can you help me search for the NBA finals result of last 5 years and then write the results in a new markdown file using Mar...
                 │     └ <function OpenAIAgentWrapper.run at 0x77ad45966940>
                 └ <backend.agents.hub.OpenAIAgents.openai_agent.OpenAIAgentWrapper object at 0x77ad4cd66b20>

  File "/home/ubuntu/workspace/VLMAgentArena/backend/agents/BaseAgent.py", line 150, in async_wrapper
    result = await func(self, *args, **kwargs)
                   │    │      │       └ {'task_instruction': 'can you help me search for the NBA finals result of last 5 years and then write the results in a new ma...
                   │    │      └ ()
                   │    └ <backend.agents.hub.OpenAIAgents.openai_agent.OpenAIAgentWrapper object at 0x77ad4cd66b20>
                   └ <function OpenAIAgentWrapper.run at 0x77ad459668b0>

> File "/home/ubuntu/workspace/VLMAgentArena/backend/agents/hub/OpenAIAgents/openai_agent.py", line 525, in run
    self.last_result = await Runner.run(self.openai_agent, task_instruction)
    │    │                   │      │   │    │             └ 'can you help me search for the NBA finals result of last 5 years and then write the results in a new markdown file using Mar...
    │    │                   │      │   │    └ Agent(name='General Agent', instructions=("You are a general digital agent. Your task is to help understand the user's instru...
    │    │                   │      │   └ <backend.agents.hub.OpenAIAgents.openai_agent.OpenAIAgentWrapper object at 0x77ad4cd66b20>
    │    │                   │      └ <classmethod object at 0x77ad459a8bb0>
    │    │                   └ <class 'agents.run.Runner'>
    │    └ None
    └ <backend.agents.hub.OpenAIAgents.openai_agent.OpenAIAgentWrapper object at 0x77ad4cd66b20>

  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/agents/run.py", line 229, in run
    turn_result = await cls._run_single_turn(
                        │   └ <classmethod object at 0x77ad459a8b80>
                        └ <class 'agents.run.Runner'>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/agents/run.py", line 719, in _run_single_turn
    new_response = await cls._get_new_response(
                         │   └ <classmethod object at 0x77ad459a8af0>
                         └ <class 'agents.run.Runner'>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/agents/run.py", line 862, in _get_new_response
    new_response = await model.get_response(
                         │     └ <function OpenAIResponsesModel.get_response at 0x77ad459f95e0>
                         └ <agents.models.openai_responses.OpenAIResponsesModel object at 0x77ad404531c0>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/agents/models/openai_responses.py", line 75, in get_response
    response = await self._fetch_response(
                     │    └ <function OpenAIResponsesModel._fetch_response at 0x77ad459f9700>
                     └ <agents.models.openai_responses.OpenAIResponsesModel object at 0x77ad404531c0>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/agents/models/openai_responses.py", line 230, in _fetch_response
    return await self._client.responses.create(
                 │    │       │         └ <function AsyncResponses.create at 0x77ad4d0d9550>
                 │    │       └ <openai.resources.responses.responses.AsyncResponses object at 0x77ad4044d970>
                 │    └ <openai.AsyncOpenAI object at 0x77ad47eb2730>
                 └ <agents.models.openai_responses.OpenAIResponsesModel object at 0x77ad404531c0>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/openai/resources/responses/responses.py", line 1414, in create
    return await self._post(
                 │    └ <bound method AsyncAPIClient.post of <openai.AsyncOpenAI object at 0x77ad47eb2730>>
                 └ <openai.resources.responses.responses.AsyncResponses object at 0x77ad4044d970>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/openai/_base_client.py", line 1767, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
                 │    │       │        │            │                  └ openai.AsyncStream[typing.Annotated[typing.Union[openai.types.responses.response_audio_delta_event.ResponseAudioDeltaEvent, o...
                 │    │       │        │            └ False
                 │    │       │        └ FinalRequestOptions(method='post', url='/responses', params={}, headers={'User-Agent': 'Agents/Python 0.0.0'}, max_retries=NO...
                 │    │       └ <class 'openai.types.responses.response.Response'>
                 │    └ <function AsyncAPIClient.request at 0x77ad4dbdcc10>
                 └ <openai.AsyncOpenAI object at 0x77ad47eb2730>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/openai/_base_client.py", line 1461, in request
    return await self._request(
                 │    └ <function AsyncAPIClient._request at 0x77ad4dbdcca0>
                 └ <openai.AsyncOpenAI object at 0x77ad47eb2730>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/openai/_base_client.py", line 1562, in _request
    raise self._make_status_error_from_response(err.response) from None
          │    └ <function BaseClient._make_status_error_from_response at 0x77ad4dbd2dc0>
          └ <openai.AsyncOpenAI object at 0x77ad47eb2730>

openai.NotFoundError: Error code: 404 - {'error': {'message': "Item with id 'fc_67d876754b2c8191bec6637d6c64a8d70ca151ec7ca34d36' not found.", 'type': 'invalid_request_error', 'param': 'input', 'code': None}}


Repro steps

computer_agent = Agent(
    name="Computer Agent",
    instructions="You are a real user computer agent, which means that you are connected to a real user's computer and granted full access to it. Your task is to help transfer the user's instructions to the computer and do the actions on the computer iteratively to finish the task. Also, you can handoff the task to the Search Agent as needed if you need to do online information retrieval.",
    tools=[ComputerTool(computer)],
    model="computer-use-preview",
    model_settings=ModelSettings(truncation="auto"),
    handoff_description="A real user computer environment to do GUI actions.",
)
search_agent = Agent(
    name="Search Agent",
    instructions="You are a searching agent connected to the Internet, you can help do information retrieval to gather useful information for the user's instruction. However, you cannot do any GUI actions on the computer unless you handoff the task to the Computer Agent.",
    tools=[WebSearchTool(user_location={"type": "approximate", "city": "New York"})],
    handoff_description="A search engine to do retrieval actions.",
)
computer_agent.handoffs.append(search_agent)
search_agent.handoffs.append(computer_agent)
all_agent = Agent(
    name="General Agent",
    instructions="You are a general digital agent. Your task is to help understand the user's instructions and help execute the task in a real computer environment, which is controlled by the Computer Agent. You can break down the task into smaller steps and delegate the steps to the corresponding agents. Remember always ground the task into the real computer environment by assigning the Computer Agent to do the actions. You can handoff the task to the Search Agent as needed if you need to do online information retrieval. But ALWAYS remember: you can only handoff the sub-task to one Agent at a time, which means you cannot handoff the task to both Computer Agent and Search Agent at the same time.",
    handoffs=[computer_agent, search_agent],
)
result = await Runner.run(instruction="can you help me search for the NBA finals result of last 5 years and then write the results in a new markdown file using MarkText, save it as 'NBA.md'")

Call Flow

1. User requested: "can you help me search for the NBA finals result of last 5 years and then write the results in a new markdown file using MarkText, save it as "NBA.md""
2. General Agent correctly determined this required both search capabilities and computer interaction
3. General Agent attempted to make parallel tool calls:
   • transfer_to_search_agent (ID: fc_67d876754b2c8191bec6637d6c64a8d70ca151ec7ca34d36)
   • transfer_to_computer_agent (ID: fc_67d87675a94c81918773f9919a2b10b30ca151ec7ca34d36)
4. When processing these calls, the system attempted to retrieve the function call by ID, but received a 404 error
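The "Multiple handoffs detected, ignoring this one." output in the log suggests the runner keeps only the first handoff call and drops the rest. A minimal sketch of that deduplication over the logged item dicts (the helper name and the `transfer_to_` prefix convention are assumptions based on the log, not SDK internals):

```python
HANDOFF_PREFIX = "transfer_to_"  # naming convention visible in the log above

def keep_first_handoff(items):
    """Keep only the first handoff-style function_call in a list of
    Responses-API item dicts, mirroring the 'Multiple handoffs detected,
    ignoring this one.' behavior seen in the log."""
    seen_handoff = False
    kept = []
    for item in items:
        is_handoff = (
            item.get("type") == "function_call"
            and item.get("name", "").startswith(HANDOFF_PREFIX)
        )
        if is_handoff:
            if seen_handoff:
                continue  # a handoff was already accepted; ignore this one
            seen_handoff = True
        kept.append(item)
    return kept
```

With the two parallel transfer calls from the log, only `transfer_to_search_agent` would survive, which matches the function_call_output entries recorded above.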

Expected Behavior

The handoff between agents should occur without any 404 errors. Function calls should remain accessible throughout the entire processing cycle.

@BowenBryanWang BowenBryanWang added the bug Something isn't working label Mar 17, 2025
rm-openai (Collaborator) commented Mar 17, 2025
I was not able to reproduce your error. I think you may have edited the example? I updated Runner.run to use all_agent and input instead of instructions, and got a different error about not being able to combine web search and computer use.

Are you able to provide a different repro, ideally one I can copy-paste and run without edits?

MoyezM commented Mar 17, 2025

@rm-openai

I'm running into a similar issue where the FileSearchTool returns a 404 when continuing a conversation. The logs show a POST to https://api.openai.com/v1/responses returning a 404 with the error:

Item with id 'fs_67d87ad142008191be5002de593d70370eb55645955e301a' not found.

I've traced it down to our organization's Zero-Data Retention (ZDR) policy. When using an API key with ZDR, the item isn’t created, leading to the error. The same behavior can be reproduced using the Responses API directly via the playground.

I've also verified that API keys from an account without ZDR don't reproduce the issue.

Maybe we could add a custom handler for the InputMessage transformation to ignore / transform certain messages, which might help bypass cases where ZDR impacts item creation. Any thoughts on this approach?

Love the library btw!
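That transformation idea can be sketched without any SDK hook: post-process the list from to_input_list() and drop the server-assigned id fields before re-sending, so a ZDR org never references items the API declined to store. This helper is hypothetical, not part of the SDK, and assumes the input items are plain dicts like those shown in the logs:

```python
def strip_item_ids(input_list):
    """Return a copy of a Responses-API input list with server-assigned
    'id' fields removed, so a follow-up request under a Zero-Data-Retention
    policy does not reference items that were never stored (which
    currently fails with a 404)."""
    cleaned = []
    for item in input_list:
        item = dict(item)      # shallow copy; don't mutate the caller's list
        item.pop("id", None)   # e.g. the 'fs_...' / 'fc_...' item ids
        cleaned.append(item)
    return cleaned
```

The second run in the repro below would then be something like `await Runner.run(agent, strip_item_ids(input_list))`; whether the API accepts id-less items in all cases is untested here.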

rm-openai (Collaborator) commented Mar 17, 2025

@MoyezM do you have a simple repro case? I'd like to debug that!

MoyezM commented Mar 17, 2025

Yup! Here you go!

async def reproduce_404(vector_store_id: str):
    agent = Agent(
        name="404Repro",
        instructions="Test Agent",
        tools=[
            FileSearchTool(vector_store_ids=[vector_store_id]),
        ],
    )

    run = await Runner.run(agent, "What files are in the vector store?")

    input_list = run.to_input_list()
    input_list.append(
        {
            "type": "message",
            "role": "user",
            "content": "Read the files in the vector store",
        }
    )
    run = await Runner.run(agent, input_list)

    print(run.raw_responses)
In [5]: await reproduce_404("vs_67d1d4a737c48191b1611b8e46377836")
{"logger": "httpx", "level": "info", "func_name": "_send_single_request", "message": "HTTP Request: POST https://api.openai.com/v1/responses \"HTTP/1.1 200 OK\"", "timestamp": "2025-03-17T22:11:24.664415Z"}
{"logger": "httpx", "level": "info", "func_name": "_send_single_request", "message": "HTTP Request: POST https://api.openai.com/v1/responses \"HTTP/1.1 404 Not Found\"", "timestamp": "2025-03-17T22:11:24.872981Z"}
{"logger": "openai.agents", "level": "error", "func_name": "get_response", "message": "Error getting response: Error code: 404 - {'error': {'message': \"Item with id 'fs_67d89e0ba740819198b170a3bf83c6c50cd8ebb1298d73de' not found.\", 'type': 'invalid_request_error', 'param': 'input', 'code': None}}. (request_id: req_a405e3db160f5e0e91a901b1cd1c004f)", "timestamp": "2025-03-17T22:11:24.970795Z"}
---------------------------------------------------------------------------
NotFoundError                             Traceback (most recent call last)
Cell In[5], line 1
----> 1 await reproduce_404("vs_67d1d4a737c48191b1611b8e46377836")

File ~/Documents/code/backend/app/domain/agents/repro.py:23, in reproduce_404(vector_store_id)
     15 input_list = run.to_input_list()
     16 input_list.append(
     17     {
     18         "type": "message",
   (...)
     21     }
     22 )
---> 23 run = await Runner.run(agent, input_list)
     25 print(run.raw_responses)

File ~/Library/Caches/pypoetry/virtualenvs/app-6OFutBOr-py3.12/lib/python3.12/site-packages/agents/run.py:210, in Runner.run(cls, starting_agent, input, context, max_turns, hooks, run_config)
    205 logger.debug(
    206     f"Running agent {current_agent.name} (turn {current_turn})",
    207 )
    209 if current_turn == 1:
--> 210     input_guardrail_results, turn_result = await asyncio.gather(
    211         cls._run_input_guardrails(
    212             starting_agent,
    213             starting_agent.input_guardrails
    214             + (run_config.input_guardrails or []),
    215             copy.deepcopy(input),
    216             context_wrapper,
    217         ),
    218         cls._run_single_turn(
    219             agent=current_agent,
    220             original_input=original_input,
    221             generated_items=generated_items,
    222             hooks=hooks,
    223             context_wrapper=context_wrapper,
    224             run_config=run_config,
    225             should_run_agent_start_hooks=should_run_agent_start_hooks,
    226         ),
    227     )
    228 else:
    229     turn_result = await cls._run_single_turn(
    230         agent=current_agent,
    231         original_input=original_input,
   (...)
    236         should_run_agent_start_hooks=should_run_agent_start_hooks,
    237     )

File ~/Library/Caches/pypoetry/virtualenvs/app-6OFutBOr-py3.12/lib/python3.12/site-packages/agents/run.py:719, in Runner._run_single_turn(cls, agent, original_input, generated_items, hooks, context_wrapper, run_config, should_run_agent_start_hooks)
    716 input = ItemHelpers.input_to_new_input_list(original_input)
    717 input.extend([generated_item.to_input_item() for generated_item in generated_items])
--> 719 new_response = await cls._get_new_response(
    720     agent,
    721     system_prompt,
    722     input,
    723     output_schema,
    724     handoffs,
    725     context_wrapper,
    726     run_config,
    727 )
    729 return await cls._get_single_step_result_from_response(
    730     agent=agent,
    731     original_input=original_input,
   (...)
    738     run_config=run_config,
    739 )

File ~/Library/Caches/pypoetry/virtualenvs/app-6OFutBOr-py3.12/lib/python3.12/site-packages/agents/run.py:862, in Runner._get_new_response(cls, agent, system_prompt, input, output_schema, handoffs, context_wrapper, run_config)
    860 model = cls._get_model(agent, run_config)
    861 model_settings = agent.model_settings.resolve(run_config.model_settings)
--> 862 new_response = await model.get_response(
    863     system_instructions=system_prompt,
    864     input=input,
    865     model_settings=model_settings,
    866     tools=agent.tools,
    867     output_schema=output_schema,
    868     handoffs=handoffs,
    869     tracing=get_model_tracing_impl(
    870         run_config.tracing_disabled, run_config.trace_include_sensitive_data
    871     ),
    872 )
    874 context_wrapper.usage.add(new_response.usage)
    876 return new_response

File ~/Library/Caches/pypoetry/virtualenvs/app-6OFutBOr-py3.12/lib/python3.12/site-packages/agents/models/openai_responses.py:75, in OpenAIResponsesModel.get_response(self, system_instructions, input, model_settings, tools, output_schema, handoffs, tracing)
     73 with response_span(disabled=tracing.is_disabled()) as span_response:
     74     try:
---> 75         response = await self._fetch_response(
     76             system_instructions,
     77             input,
     78             model_settings,
     79             tools,
     80             output_schema,
     81             handoffs,
     82             stream=False,
     83         )
     85         if _debug.DONT_LOG_MODEL_DATA:
     86             logger.debug("LLM responsed")

File ~/Library/Caches/pypoetry/virtualenvs/app-6OFutBOr-py3.12/lib/python3.12/site-packages/agents/models/openai_responses.py:230, in OpenAIResponsesModel._fetch_response(self, system_instructions, input, model_settings, tools, output_schema, handoffs, stream)
    220 else:
    221     logger.debug(
    222         f"Calling LLM {self.model} with input:\n"
    223         f"{json.dumps(list_input, indent=2)}\n"
   (...)
    227         f"Response format: {response_format}\n"
    228     )
--> 230 return await self._client.responses.create(
    231     instructions=self._non_null_or_not_given(system_instructions),
    232     model=self.model,
    233     input=list_input,
    234     include=converted_tools.includes,
    235     tools=converted_tools.tools,
    236     temperature=self._non_null_or_not_given(model_settings.temperature),
    237     top_p=self._non_null_or_not_given(model_settings.top_p),
    238     truncation=self._non_null_or_not_given(model_settings.truncation),
    239     max_output_tokens=self._non_null_or_not_given(model_settings.max_tokens),
    240     tool_choice=tool_choice,
    241     parallel_tool_calls=parallel_tool_calls,
    242     stream=stream,
    243     extra_headers=_HEADERS,
    244     text=response_format,
    245 )

File ~/Library/Caches/pypoetry/virtualenvs/app-6OFutBOr-py3.12/lib/python3.12/site-packages/openai/resources/responses/responses.py:1414, in AsyncResponses.create(self, input, model, include, instructions, max_output_tokens, metadata, parallel_tool_calls, previous_response_id, reasoning, store, stream, temperature, text, tool_choice, tools, top_p, truncation, user, extra_headers, extra_query, extra_body, timeout)
   1385 @required_args(["input", "model"], ["input", "model", "stream"])
   1386 async def create(
   1387     self,
   (...)
   1412     timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
   1413 ) -> Response | AsyncStream[ResponseStreamEvent]:
-> 1414     return await self._post(
   1415         "/responses",
   1416         body=await async_maybe_transform(
   1417             {
   1418                 "input": input,
   1419                 "model": model,
   1420                 "include": include,
   1421                 "instructions": instructions,
   1422                 "max_output_tokens": max_output_tokens,
   1423                 "metadata": metadata,
   1424                 "parallel_tool_calls": parallel_tool_calls,
   1425                 "previous_response_id": previous_response_id,
   1426                 "reasoning": reasoning,
   1427                 "store": store,
   1428                 "stream": stream,
   1429                 "temperature": temperature,
   1430                 "text": text,
   1431                 "tool_choice": tool_choice,
   1432                 "tools": tools,
   1433                 "top_p": top_p,
   1434                 "truncation": truncation,
   1435                 "user": user,
   1436             },
   1437             response_create_params.ResponseCreateParams,
   1438         ),
   1439         options=make_request_options(
   1440             extra_headers=extra_headers, extra_query=extra_query, extra_body=extra_body, timeout=timeout
   1441         ),
   1442         cast_to=Response,
   1443         stream=stream or False,
   1444         stream_cls=AsyncStream[ResponseStreamEvent],
   1445     )

File ~/Library/Caches/pypoetry/virtualenvs/app-6OFutBOr-py3.12/lib/python3.12/site-packages/openai/_base_client.py:1767, in AsyncAPIClient.post(self, path, cast_to, body, files, options, stream, stream_cls)
   1753 async def post(
   1754     self,
   1755     path: str,
   (...)
   1762     stream_cls: type[_AsyncStreamT] | None = None,
   1763 ) -> ResponseT | _AsyncStreamT:
   1764     opts = FinalRequestOptions.construct(
   1765         method="post", url=path, json_data=body, files=await async_to_httpx_files(files), **options
   1766     )
-> 1767     return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)

File ~/Library/Caches/pypoetry/virtualenvs/app-6OFutBOr-py3.12/lib/python3.12/site-packages/openai/_base_client.py:1461, in AsyncAPIClient.request(self, cast_to, options, stream, stream_cls, remaining_retries)
   1458 else:
   1459     retries_taken = 0
-> 1461 return await self._request(
   1462     cast_to=cast_to,
   1463     options=options,
   1464     stream=stream,
   1465     stream_cls=stream_cls,
   1466     retries_taken=retries_taken,
   1467 )

File ~/Library/Caches/pypoetry/virtualenvs/app-6OFutBOr-py3.12/lib/python3.12/site-packages/openai/_base_client.py:1562, in AsyncAPIClient._request(self, cast_to, options, stream, stream_cls, retries_taken)
   1559         await err.response.aread()
   1561     log.debug("Re-raising status error")
-> 1562     raise self._make_status_error_from_response(err.response) from None
   1564 return await self._process_response(
   1565     cast_to=cast_to,
   1566     options=options,
   (...)
   1570     retries_taken=retries_taken,
   1571 )

NotFoundError: Error code: 404 - {'error': {'message': "Item with id 'fs_67d89e0ba740819198b170a3bf83c6c50cd8ebb1298d73de' not found.", 'type': 'invalid_request_error', 'param': 'input', 'code': None}}

@BowenBryanWang (Author)

hi @rm-openai, I fixed the issue above by changing my api_key, thanks for your help!

But sadly I ran into another bug, which reports:

  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/agents/run.py", line 229, in run
    turn_result = await cls._run_single_turn(
                        │   └ <classmethod object at 0x7140f1ec9190>
                        └ <class 'agents.run.Runner'>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/agents/run.py", line 719, in _run_single_turn
    new_response = await cls._get_new_response(
                         │   └ <classmethod object at 0x7140f1ec9220>
                         └ <class 'agents.run.Runner'>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/agents/run.py", line 862, in _get_new_response
    new_response = await model.get_response(
                         │     └ <function OpenAIResponsesModel.get_response at 0x7140f1f135e0>
                         └ <agents.models.openai_responses.OpenAIResponsesModel object at 0x7140edca55e0>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/agents/models/openai_responses.py", line 75, in get_response
    response = await self._fetch_response(
                     │    └ <function OpenAIResponsesModel._fetch_response at 0x7140f1f13700>
                     └ <agents.models.openai_responses.OpenAIResponsesModel object at 0x7140edca55e0>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/agents/models/openai_responses.py", line 230, in _fetch_response
    return await self._client.responses.create(
                 │    │       │         └ <function AsyncResponses.create at 0x7140f9a144c0>
                 │    │       └ <openai.resources.responses.responses.AsyncResponses object at 0x7140ee666940>
                 │    └ <openai.AsyncOpenAI object at 0x7140ee76a460>
                 └ <agents.models.openai_responses.OpenAIResponsesModel object at 0x7140edca55e0>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/openai/resources/responses/responses.py", line 1414, in create
    return await self._post(
                 │    └ <bound method AsyncAPIClient.post of <openai.AsyncOpenAI object at 0x7140ee76a460>>
                 └ <openai.resources.responses.responses.AsyncResponses object at 0x7140ee666940>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/openai/_base_client.py", line 1767, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
                 │    │       │        │            │                  └ openai.AsyncStream[typing.Annotated[typing.Union[openai.types.responses.response_audio_delta_event.ResponseAudioDeltaEvent, o...
                 │    │       │        │            └ False
                 │    │       │        └ FinalRequestOptions(method='post', url='/responses', params={}, headers={'User-Agent': 'Agents/Python 0.0.0'}, max_retries=NO...
                 │    │       └ <class 'openai.types.responses.response.Response'>
                 │    └ <function AsyncAPIClient.request at 0x7140fbf38f70>
                 └ <openai.AsyncOpenAI object at 0x7140ee76a460>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/openai/_base_client.py", line 1461, in request
    return await self._request(
                 │    └ <function AsyncAPIClient._request at 0x7140fbf33040>
                 └ <openai.AsyncOpenAI object at 0x7140ee76a460>
  File "/home/ubuntu/anaconda3/envs/agent-arena-backend/lib/python3.9/site-packages/openai/_base_client.py", line 1562, in _request
    raise self._make_status_error_from_response(err.response) from None
          │    └ <function BaseClient._make_status_error_from_response at 0x7140fbf3c160>
          └ <openai.AsyncOpenAI object at 0x7140ee76a460>

openai.BadRequestError: Error code: 400 - {'error': {'message': 'Computer tool can not be used with Web search call.', 'type': 'invalid_request_error', 'param': 'input', 'code': None}}

It seems I cannot combine a Computer Agent with a Search Agent; I'm a bit confused...

Here is another script that reproduces my case:

import asyncio

from computers import DockerComputer
from agents import (
    Agent,
    ComputerTool,
    ModelSettings,
    Runner,
    WebSearchTool,
)


async def main():
    # Keep the Docker computer alive for the whole run: in my first attempt
    # the `with` block exited before Runner.run was awaited, so the agents
    # were holding an already-closed computer.
    with DockerComputer() as computer:
        computer_agent = Agent(
            name="Computer Agent",
            instructions="You are a real user computer agent, which means that you are connected to a real user's computer and granted full access to it. Your task is to help transfer the user's instructions to the computer and do the actions on the computer iteratively to finish the task. Also, you can handoff the task to the Search Agent as needed if you need to do online information retrieval.",
            tools=[ComputerTool(computer)],
            model="computer-use-preview",
            model_settings=ModelSettings(truncation="auto"),
            handoff_description="A real user computer environment to do GUI actions.",
        )
        search_agent = Agent(
            name="Search Agent",
            instructions="You are a searching agent connected to the Internet, you can help do information retrieval to gather useful information for the user's instruction. However, you cannot do any GUI actions on the computer unless you handoff the task to the Computer Agent.",
            tools=[WebSearchTool(user_location={"type": "approximate", "city": "New York"})],
            handoff_description="A search engine to do retrieval actions.",
        )
        computer_agent.handoffs.append(search_agent)
        search_agent.handoffs.append(computer_agent)
        triage_agent = Agent(
            name="Triage Agent",
            instructions="You are a general digital agent. Your task is to help understand the user's instructions and help execute the task in a real computer environment, which is controlled by the Computer Agent. You can break down the task into smaller steps and delegate the steps to the corresponding agents. Remember always ground the task into the real computer environment by assigning the Computer Agent to do the actions. You can handoff the task to the Search Agent as needed if you need to do online information retrieval. But ALWAYS remember: you can only handoff the sub-task to one Agent at a time, which means you cannot handoff the task to both Computer Agent and Search Agent at the same time.",
            handoffs=[computer_agent, search_agent],
        )
        result = await Runner.run(
            triage_agent,
            input="help me search for the NBA finals result of last 5 years and then write the results in a new markdown file using MarkText, save it as 'NBA.md'",
        )
        print(result)


if __name__ == "__main__":
    asyncio.run(main())

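From the 400 above, my guess (not confirmed by the docs) is that a single Responses API request cannot mix computer-use items with web-search items in its `input`, and the shared handoff history is what carries one agent's tool calls into the other agent's request. A minimal stand-alone sketch of the idea (plain Python, no SDK; the item-type names mirror Responses API output items, but the `ALLOWED_ITEM_TYPES` mapping and `filter_history` helper are my own assumptions, not SDK behavior):

```python
# Hypothetical sketch: strip history items whose type the target agent's
# tool set cannot appear alongside, before building that agent's request.
ALLOWED_ITEM_TYPES = {
    "Computer Agent": {"message", "computer_call", "computer_call_output"},
    "Search Agent": {"message", "web_search_call"},
}


def filter_history(agent_name: str, items: list) -> list:
    """Keep only the history items the named agent's tools can accept."""
    allowed = ALLOWED_ITEM_TYPES[agent_name]
    return [item for item in items if item.get("type", "message") in allowed]


history = [
    {"type": "message", "content": "search for the NBA finals results"},
    {"type": "web_search_call", "id": "ws_1"},
    {"type": "computer_call", "id": "cc_1"},
]

# The Computer Agent's request then contains no web_search_call items.
print([item["type"] for item in filter_history("Computer Agent", history)])
```

If the SDK applied something like this when switching agents, the computer-use model would never see the search agent's tool calls; maybe there's an existing option for this that I'm missing?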
Could you please help me solve this?
