What happened?
I implemented the Swarm example code (travel_agent, flight_refund, and user) in VS Code using AzureOpenAIChatCompletionClient. When the user provides a feedback answer, the next iteration fails. See the error below:
Error processing publish message for travel_agent/a72ed221-56c2-4bf3-a7ce-111a9e35c224
Traceback (most recent call last):
File "{PATH}\site-packages\autogen_core\_single_threaded_agent_runtime.py", line 409, in _on_message
return await agent.on_message(
^^^^^^^^^^^^^^^^^^^^^^^
File "{PATH}\site-packages\autogen_core\_base_agent.py", line 113, in on_message
return await self.on_message_impl(message, ctx)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "{PATH}\site-packages\autogen_agentchat\teams\_group_chat\_sequential_routed_agent.py", line 48, in on_message_impl
return await super().on_message_impl(message, ctx)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "{PATH}\site-packages\autogen_core\_routed_agent.py", line 485, in on_message_impl
return await h(self, message, ctx)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "{PATH}\site-packages\autogen_core\_routed_agent.py", line 268, in wrapper
return_value = await func(self, message, ctx) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "{PATH}\site-packages\autogen_agentchat\teams\_group_chat\_chat_agent_container.py", line 53, in handle_request
async for msg in self._agent.on_messages_stream(self._message_buffer, ctx.cancellation_token):
File "{PATH}\site-packages\autogen_agentchat\agents\_assistant_agent.py", line 330, in on_messages_stream
result = await self._model_client.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "{PATH}\site-packages\autogen_ext\models\openai\_openai_client.py", line 494, in create
result: Union[ParsedChatCompletion[BaseModel], ChatCompletion] = await future
^^^^^^^^^^^^
File "{PATH}\site-packages\openai\resources\chat\completions.py", line 1720, in create
return await self._post(
^^^^^^^^^^^^^^^^^
File "{PATH}\site-packages\openai\_base_client.py", line 1849, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "{PATH}\site-packages\openai\_base_client.py", line 1543, in request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "{PATH}\site-packages\openai\_base_client.py", line 1629, in _request
return await self._retry_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "{PATH}\site-packages\openai\_base_client.py", line 1676, in _retry_request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "{PATH}\site-packages\openai\_base_client.py", line 1629, in _request
return await self._retry_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "{PATH}\site-packages\openai\_base_client.py", line 1676, in _retry_request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "{PATH}\site-packages\openai\_base_client.py", line 1644, in _request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
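For context, the first run of the team completes and the travel agent hands off to the user; the failure appears on the follow-up run that feeds the user's reply back in. That loop is essentially the one from the linked tutorial, sketched below with the Azure client swapped in (endpoint, key, deployment, and API version are placeholders, and the system messages and refund tool are abbreviated):

```python
import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import HandoffTermination, TextMentionTermination
from autogen_agentchat.messages import HandoffMessage
from autogen_agentchat.teams import Swarm
from autogen_agentchat.ui import Console
from autogen_ext.models.openai import AzureOpenAIChatCompletionClient

# Placeholders: fill in your own Azure deployment, endpoint, API version, and key.
model_client = AzureOpenAIChatCompletionClient(
    azure_deployment="gpt-4o-mini",
    model="gpt-4o-mini",
    api_version="2024-06-01",
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-api-key>",
)

travel_agent = AssistantAgent(
    "travel_agent",
    model_client=model_client,
    handoffs=["flights_refunder", "user"],
    system_message="You are a travel agent. Hand off to the user when you need input.",
)
flights_refunder = AssistantAgent(
    "flights_refunder",
    model_client=model_client,
    handoffs=["travel_agent", "user"],
    system_message="You handle flight refunds.",
)

termination = HandoffTermination(target="user") | TextMentionTermination("TERMINATE")
team = Swarm([travel_agent, flights_refunder], termination_condition=termination)


async def main() -> None:
    # The first run works; the error above is raised on the follow-up run,
    # after the user types a reply to the agent's handoff question.
    result = await Console(team.run_stream(task="I need to refund my flight."))
    last = result.messages[-1]
    while isinstance(last, HandoffMessage) and last.target == "user":
        user_reply = input("User: ")
        result = await Console(
            team.run_stream(
                task=HandoffMessage(source="user", target=last.source, content=user_reply)
            )
        )
        last = result.messages[-1]


asyncio.run(main())
```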
What did you expect to happen?
The example code should run as expected.
How can we reproduce it (as minimally and precisely as possible)?
Just run the code from the link below using AzureOpenAIChatCompletionClient:
https://microsoft.github.io/autogen/stable/user-guide/agentchat-user-guide/swarm.html
AutoGen version
0.4.3
Which package was this bug in
Core
Model used
gpt-4o-mini
Python version
3.11.1
Operating system
Windows
Any additional info you think would be helpful for fixing this bug
No response
openai.InternalServerError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
This means the model endpoint threw a 500 internal server error. It's basically a black box to us. We have been seeing this from OpenAI models recently as well. Consider switching the model version.
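For example (a rough sketch; the deployment name, API version, and endpoint are placeholders for whatever your Azure OpenAI resource exposes), switching the model version is just a change to the client construction:

```python
from autogen_ext.models.openai import AzureOpenAIChatCompletionClient

# Point the client at a deployment backed by a different model version
# (placeholders; use a deployment that actually exists in your Azure OpenAI resource).
model_client = AzureOpenAIChatCompletionClient(
    azure_deployment="gpt-4o",  # e.g. a gpt-4o deployment instead of gpt-4o-mini
    model="gpt-4o",
    api_version="2024-06-01",
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-api-key>",
)
```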