You can set the response_format to ask for responses in JSON (at least for `gpt-3.5-turbo-1106`):

}
```
You can stream it as well!

```ruby
response = client.chat(
  parameters: {
    model: "gpt-3.5-turbo-1106",
    messages: [{ role: "user", content: "Can I have some JSON please?" }],
    response_format: { type: "json_object" },
    stream: proc do |chunk, _bytesize|
      print chunk.dig("choices", 0, "delta", "content")
    end
  })
{
  "message": "Sure, please let me know what specific JSON data you are looking for.",
  "JSON_data": {
    "example_1": {
      "key_1": "value_1",
      "key_2": "value_2",
      "key_3": "value_3"
    },
    "example_2": {
      "key_4": "value_4",
      "key_5": "value_5",
      "key_6": "value_6"
    }
  }
}
```
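Because JSON mode still delivers the object as plain text, a streamed response arrives as partial fragments that only parse once the stream is complete. A minimal sketch of accumulating deltas into a buffer and parsing at the end (the `chunks` array here is a hypothetical stand-in for the content deltas a real stream would yield; the actual `client.chat` call is not shown):

```ruby
require "json"

# Simulated content deltas, in the order a streaming proc would receive them.
chunks = ['{"mess', 'age": "Here is', ' your JSON"}']

buffer = +"" # unfrozen string to accumulate the streamed fragments
chunks.each { |content| buffer << content }

# Only parse once the full object has arrived; a partial buffer would raise.
parsed = JSON.parse(buffer)
puts parsed["message"]
```

In a real call, the body of the `stream:` proc would append `chunk.dig("choices", 0, "delta", "content")` to the buffer instead of printing it.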
### Functions

You can describe and pass in functions and the model will intelligently choose to output a JSON object containing arguments to call them. For example, if you want the model to use your method `get_current_weather` to get the current weather in a given location: