Commit 0832fd5

Merge pull request #492 from marckohlbrugge/main
Add parameters support to batches.list and update README to include /v1/embeddings Batches support
2 parents 6098298 + 7d7dc6d commit 0832fd5

File tree

3 files changed: +25, −5 lines


README.md (+1, −3)

````diff
@@ -526,11 +526,9 @@ puts response.dig("data", 0, "embedding")
 ```
 
 ### Batches
-
-The Batches endpoint allows you to create and manage large batches of API requests to run asynchronously. Currently, only the `/v1/chat/completions` endpoint is supported for batches.
+The Batches endpoint allows you to create and manage large batches of API requests to run asynchronously. Currently, the supported endpoints for batches are `/v1/chat/completions` (Chat Completions API) and `/v1/embeddings` (Embeddings API).
 
 To use the Batches endpoint, you need to first upload a JSONL file containing the batch requests using the Files endpoint. The file must be uploaded with the purpose set to `batch`. Each line in the JSONL file represents a single request and should have the following format:
-
 ```json
 {
   "custom_id": "request-1",
````
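The README excerpt above is truncated mid-example. For context, a single batch request line targeting the newly supported `/v1/embeddings` endpoint might look like the following (the model name and input are illustrative values, not taken from the diff):

```json
{"custom_id": "request-1", "method": "POST", "url": "/v1/embeddings", "body": {"model": "text-embedding-3-small", "input": "Hello, world!"}}
```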

lib/openai/batches.rb (+2, −2)

```diff
@@ -4,8 +4,8 @@ def initialize(client:)
       @client = client.beta(assistants: OpenAI::Assistants::BETA_VERSION)
     end
 
-    def list
-      @client.get(path: "/batches")
+    def list(parameters: {})
+      @client.get(path: "/batches", parameters: parameters)
     end
 
     def retrieve(id:)
```
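The change above simply forwards a `parameters` hash to the client's HTTP layer. As a hedged sketch of what that implies for the request (the helper name `build_list_path` and the serialization details are assumptions for illustration, not the gem's actual internals), the hash presumably ends up encoded as a query string on the path:

```ruby
require "uri"

# Hypothetical helper illustrating how a parameters hash could be
# serialized onto the request path; the real gem delegates this to
# its HTTP layer rather than exposing a method like this.
def build_list_path(base, parameters)
  return base if parameters.empty?

  "#{base}?#{URI.encode_www_form(parameters)}"
end

puts build_list_path("/batches", { after: "batch_abc123", limit: 10 })
# => "/batches?after=batch_abc123&limit=10"
```

In other words, `batches.list(parameters: { after: ..., limit: ... })` lets callers page through batches rather than always fetching the first page.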

spec/openai/client/batches_spec.rb (+22)

```diff
@@ -24,7 +24,17 @@
   describe "#list", :vcr do
     let(:input_cassette) { "batches list input" }
     let(:cassette) { "batches list" }
+    let(:limit) { 10 }
     let(:response) { OpenAI::Client.new.batches.list }
+    let(:existing_batch) do
+      OpenAI::Client.new.batches.create(
+        parameters: {
+          input_file_id: input_file_id,
+          endpoint: "/v1/chat/completions",
+          completion_window: "24h"
+        }
+      )
+    end
 
     before { batch_id }
 
@@ -33,6 +43,18 @@
         expect(response.dig("data", 0, "object")).to eq("batch")
       end
     end
+
+    it "supports after and limit parameters" do
+      VCR.use_cassette(cassette) do
+        response = OpenAI::Client.new.batches.list(parameters: {
+          after: existing_batch["id"],
+          limit: limit
+        })
+
+        expect(response.dig("data", 0, "object")).to eq("batch")
+        expect(response["data"].length).to be <= limit
+      end
+    end
   end
 
   describe "#retrieve" do
```
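The spec exercises `after` and `limit` as a cursor-pagination pair: `after` names the last batch already seen and `limit` caps the page size. The following runnable sketch shows that pattern offline, with `FakeBatches` standing in for the real client (which would be called as `OpenAI::Client.new.batches.list(parameters: { after: ..., limit: ... })`):

```ruby
# FakeBatches is a hypothetical offline stand-in for the gem's Batches
# resource, implementing the after/limit semantics the spec relies on.
class FakeBatches
  def initialize(ids)
    @ids = ids
  end

  # Mimics GET /batches: returns up to `limit` batches after the cursor.
  def list(parameters: {})
    start = parameters[:after] ? @ids.index(parameters[:after]) + 1 : 0
    page = @ids[start, parameters.fetch(:limit, 20)] || []
    { "data" => page.map { |id| { "id" => id, "object" => "batch" } } }
  end
end

batches = FakeBatches.new(%w[batch_1 batch_2 batch_3])
all_ids = []
cursor = nil
loop do
  page = batches.list(parameters: { after: cursor, limit: 2 })["data"]
  break if page.empty?

  all_ids.concat(page.map { |b| b["id"] })
  cursor = all_ids.last # feed the last id back in as the next cursor
end
puts all_ids.inspect
# => ["batch_1", "batch_2", "batch_3"]
```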
