Description
If `stream=True`, we are noticing that a 400 surfaces in the server logs only as:

```
StreamError("Invalid status code: 400 Bad Request")
```
On the client side (Python consumption):

```
httpx.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)
```
If `stream=False`, we still get the 400, but with much richer information:
```
Traceback (most recent call last):
  File "test.py", line 5, in <module>
    out = client.chat.completions.create(
          ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "python3.12/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "python3.12/site-packages/openai/resources/chat/completions.py", line 829, in create
    return self._post(
           ^^^^^^^^^^^
  File "python3.12/site-packages/openai/_base_client.py", line 1277, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "python3.12/site-packages/openai/_base_client.py", line 954, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "python3.12/site-packages/openai/_base_client.py", line 1058, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: After the optional system message, conversation roles must alternate user/assistant/user/assistant/...
```
It would be great if we could retrieve the enriched error information even when `stream=True`; that is the current behavior with the Python client.
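For reference, a minimal sketch of the kind of payload that triggers this 400 (the exact file `test.py` from the traceback is not shown, so this is an assumed example): the server rejects conversations whose roles do not alternate user/assistant after the optional system message. The local check below mirrors that rule; the `roles_alternate` helper is purely illustrative, not part of any client library.

```python
# Hypothetical payload: two consecutive "user" turns violate the
# alternation rule and would produce the 400 shown above.
messages = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hello"},
    {"role": "user", "content": "Hello again"},  # consecutive user turn -> 400
]

def roles_alternate(messages):
    """Return True if, after an optional leading system message,
    roles strictly alternate user/assistant/user/assistant/..."""
    roles = [m["role"] for m in messages if m["role"] != "system"]
    expected = ["user", "assistant"]
    return all(r == expected[i % 2] for i, r in enumerate(roles))

print(roles_alternate(messages))  # False: this payload is rejected
```

With `stream=False` the server reports this violation in the `BadRequestError` message; with `stream=True` the same request only yields the opaque `StreamError` and the truncated chunked response on the client.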