
Commit a7446d5

stainless-bot authored and megamanics committed Aug 14, 2024
release: 1.14.1 (openai#1239)

* docs(readme): assistant streaming (openai#1238)
* release: 1.14.1
1 parent 8e0f37a commit a7446d5

File tree

5 files changed: +172 −3 lines changed

 

.release-please-manifest.json

+1 −1
@@ -1,3 +1,3 @@
 {
-  ".": "1.14.0"
+  ".": "1.14.1"
 }

CHANGELOG.md

+8
@@ -1,5 +1,13 @@
 # Changelog

+## 1.14.1 (2024-03-15)
+
+Full Changelog: [v1.14.0...v1.14.1](https://github.com/openai/openai-python/compare/v1.14.0...v1.14.1)
+
+### Documentation
+
+* **readme:** assistant streaming ([#1238](https://github.com/openai/openai-python/issues/1238)) ([0fc30a2](https://github.com/openai/openai-python/commit/0fc30a23030b4ff60f27cd2f472517926ed0f300))
+
 ## 1.14.0 (2024-03-13)

 Full Changelog: [v1.13.4...v1.14.0](https://github.com/openai/openai-python/compare/v1.13.4...v1.14.0)

helpers.md

+161
@@ -0,0 +1,161 @@
# Streaming Helpers

OpenAI supports streaming responses when interacting with the [Assistant](#assistant-streaming-api) APIs.

## Assistant Streaming API

OpenAI supports streaming responses from Assistants. The SDK provides convenience wrappers around the API
so you can subscribe to the types of events you are interested in as well as receive accumulated responses.

More information can be found in the documentation: [Assistant Streaming](https://platform.openai.com/docs/assistants/overview?lang=python)

#### An example of creating a run and subscribing to some events

You can subscribe to events by creating an event handler class and overriding the relevant event handlers.

```python
from typing_extensions import override
from openai import AssistantEventHandler

# First, we create an EventHandler class to define
# how we want to handle the events in the response stream.

class EventHandler(AssistantEventHandler):
    @override
    def on_text_created(self, text) -> None:
        print(f"\nassistant > ", end="", flush=True)

    @override
    def on_text_delta(self, delta, snapshot):
        print(delta.value, end="", flush=True)

    def on_tool_call_created(self, tool_call):
        print(f"\nassistant > {tool_call.type}\n", flush=True)

    def on_tool_call_delta(self, delta, snapshot):
        if delta.type == 'code_interpreter':
            if delta.code_interpreter.input:
                print(delta.code_interpreter.input, end="", flush=True)
            if delta.code_interpreter.outputs:
                print(f"\n\noutput >", flush=True)
                for output in delta.code_interpreter.outputs:
                    if output.type == "logs":
                        print(f"\n{output.logs}", flush=True)

# Then, we use the `create_and_stream` SDK helper
# with the `EventHandler` class to create the Run
# and stream the response.

with client.beta.threads.runs.create_and_stream(
    thread_id=thread.id,
    assistant_id=assistant.id,
    instructions="Please address the user as Jane Doe. The user has a premium account.",
    event_handler=EventHandler(),
) as stream:
    stream.until_done()
```
### Assistant Events

The assistant API provides the following events that you can subscribe to.

```python
def on_event(self, event: AssistantStreamEvent)
```

This allows you to subscribe to all the possible raw events sent by the OpenAI streaming API.
In many cases it will be more convenient to subscribe to a more specific set of events for your use case.

More information on the types of events can be found here: [Events](https://platform.openai.com/docs/api-reference/assistants-streaming/events)
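For illustration, a raw-event logger might look like the sketch below (the `RawEventLogger` name and the `openai.types.beta` import path are assumptions; each streamed event is expected to expose `event`, its name, and `data`, its payload):

```python
from typing_extensions import override
from openai import AssistantEventHandler
from openai.types.beta import AssistantStreamEvent  # assumed import path

class RawEventLogger(AssistantEventHandler):
    @override
    def on_event(self, event: AssistantStreamEvent) -> None:
        # Print each raw event name (e.g. "thread.message.delta")
        # together with the type of its payload object.
        print(event.event, type(event.data).__name__)
```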
```python
def on_run_step_created(self, run_step: RunStep)
def on_run_step_delta(self, delta: RunStepDelta, snapshot: RunStep)
def on_run_step_done(self, run_step: RunStep)
```

These events allow you to subscribe to the creation, delta and completion of a RunStep.

For more information on how Runs and RunSteps work, see the documentation: [Runs and RunSteps](https://platform.openai.com/docs/assistants/how-it-works/runs-and-run-steps)
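As a sketch, a handler that tracks the run-step lifecycle could look like the following (the `RunStepTracker` name and the `RunStep` import path are assumptions made for this example):

```python
from typing_extensions import override
from openai import AssistantEventHandler
from openai.types.beta.threads.runs import RunStep  # assumed import path

class RunStepTracker(AssistantEventHandler):
    @override
    def on_run_step_created(self, run_step: RunStep) -> None:
        # A run step is either "message_creation" or "tool_calls".
        print(f"run step started: {run_step.type}")

    @override
    def on_run_step_done(self, run_step: RunStep) -> None:
        print(f"run step finished: {run_step.id} ({run_step.status})")
```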
```python
def on_message_created(self, message: Message)
def on_message_delta(self, delta: MessageDelta, snapshot: Message)
def on_message_done(self, message: Message)
```

This allows you to subscribe to Message creation, delta and completion events. Messages can contain
different types of content that can be sent from a model (and events are available for specific content types).
For convenience, the delta event includes both the incremental update and an accumulated snapshot of the content.

More information on messages can be found in the documentation page [Message](https://platform.openai.com/docs/api-reference/messages/object).
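For example, a minimal sketch of a handler that prints the text of each completed message (the `MessagePrinter` name and the `Message` import path are assumptions; text content blocks are expected to expose `text.value`):

```python
from typing_extensions import override
from openai import AssistantEventHandler
from openai.types.beta.threads import Message  # assumed import path

class MessagePrinter(AssistantEventHandler):
    @override
    def on_message_done(self, message: Message) -> None:
        # A completed message may contain several content blocks; print the text ones.
        for block in message.content:
            if block.type == "text":
                print(block.text.value)
```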
```python
def on_text_created(self, text: Text)
def on_text_delta(self, delta: TextDelta, snapshot: Text)
def on_text_done(self, text: Text)
```

These events allow you to subscribe to the creation, delta and completion of Text content (a specific type of message content).
For convenience, the delta event includes both the incremental update and an accumulated snapshot of the content.
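If you only care about the streamed text, you may not need a handler class at all. A minimal sketch, assuming the stream object exposes a `text_deltas` iterator and reusing the `client`, `thread` and `assistant` objects from the first example:

```python
# Iterate over just the streamed text instead of overriding individual handlers.
with client.beta.threads.runs.create_and_stream(
    thread_id=thread.id,
    assistant_id=assistant.id,
) as stream:
    for text in stream.text_deltas:
        print(text, end="", flush=True)
    print()
```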
```python
def on_image_file_done(self, image_file: ImageFile)
```

Image files are not sent incrementally so an event is provided for when an image file is available.
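As an illustrative sketch (the `ImageCollector` name and the `ImageFile` import path are assumptions), a handler could collect the file IDs so the images can be downloaded after the run:

```python
from typing_extensions import override
from openai import AssistantEventHandler
from openai.types.beta.threads import ImageFile  # assumed import path

class ImageCollector(AssistantEventHandler):
    def __init__(self) -> None:
        super().__init__()
        self.image_file_ids = []  # IDs of image files produced during the run

    @override
    def on_image_file_done(self, image_file: ImageFile) -> None:
        # Only the file ID is streamed; the bytes can be fetched afterwards,
        # e.g. via the Files API (client.files.content(...)).
        self.image_file_ids.append(image_file.file_id)
```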
```python
def on_tool_call_created(self, tool_call: ToolCall)
def on_tool_call_delta(self, delta: ToolCallDelta, snapshot: ToolCall)
def on_tool_call_done(self, tool_call: ToolCall)
```

These events allow you to subscribe to the creation, delta and completion of a ToolCall.

More information on tools can be found here: [Tools](https://platform.openai.com/docs/assistants/tools)
```python
def on_end(self)
```

The last event sent when a stream ends.

```python
def on_timeout(self)
```

This event is triggered if the request times out.

```python
def on_exception(self, exception: Exception)
```

This event is triggered if an exception occurs during streaming.
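A minimal sketch of a handler that reacts to these lifecycle events (the `LifecycleHandler` name and the printed messages are assumptions made for this example):

```python
from typing_extensions import override
from openai import AssistantEventHandler

class LifecycleHandler(AssistantEventHandler):
    @override
    def on_exception(self, exception: Exception) -> None:
        # React to a failure that occurred while streaming.
        print(f"stream error: {exception}")

    @override
    def on_end(self) -> None:
        # Runs once, after the final event has been received.
        print("stream finished")
```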
### Assistant Methods

The assistant streaming object also provides a few methods for convenience:

```python
def current_event()
def current_run()
def current_message_snapshot()
def current_run_step_snapshot()
```

These methods are provided to allow you to access additional context from within event handlers. In many cases
the handlers should include all the information you need for processing, but if additional context is required it
can be accessed.

Note: a relevant context is not always available; in those cases these will be undefined.
```python
def get_final_run(self)
def get_final_run_steps(self)
def get_final_messages(self)
```

These methods are provided for convenience to collect information at the end of a stream. Calling these methods
will trigger consumption of the stream until completion and then return the relevant accumulated objects.
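For instance, a minimal sketch that drains the stream and then prints the text of the final messages, reusing the `client`, `thread` and `assistant` objects assumed in the first example:

```python
with client.beta.threads.runs.create_and_stream(
    thread_id=thread.id,
    assistant_id=assistant.id,
) as stream:
    # Consumes the rest of the stream, then returns the accumulated messages.
    messages = stream.get_final_messages()

for message in messages:
    for block in message.content:
        if block.type == "text":
            print(block.text.value)
```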

pyproject.toml

+1 −1
@@ -1,6 +1,6 @@
 [project]
 name = "openai"
-version = "1.14.0"
+version = "1.14.1"
 description = "The official Python library for the openai API"
 readme = "README.md"
 license = "Apache-2.0"

src/openai/_version.py

+1 −1
@@ -1,4 +1,4 @@
 # File generated from our OpenAPI spec by Stainless.

 __title__ = "openai"
-__version__ = "1.14.0" # x-release-please-version
+__version__ = "1.14.1" # x-release-please-version
