OpenAI API wrapper for Julia (Unofficial)

Overview

A community-maintained Julia wrapper for the OpenAI API. For details of the underlying API, see OpenAI's reference documentation. Autogenerated documentation for this package is available at https://juliaml.github.io/OpenAI.jl/dev/

Usage

using Pkg; Pkg.add("OpenAI")

Quick Start

  1. Create an OpenAI account, if you don't already have one

  2. Create a secret API key

  3. Choose a model to interact with

⚠️ We strongly suggest setting up your API key as an ENV variable.

using OpenAI

secret_key = ENV["OPENAI_API_KEY"]
model = "gpt-5-mini"
prompt = "Say \"this is a test\""

r = create_chat(
    secret_key,
    model,
    [Dict("role" => "user", "content" => prompt)]
)
println(r.response[:choices][begin][:message][:content])

which prints something like

"This is a test."

Overriding default parameters

If you have a non-standard setup, such as a local LLM or a third-party service that conforms to the OpenAI interface, you can override the default API parameters with the OpenAIProvider struct:

using OpenAI
provider = OpenAI.OpenAIProvider(
    api_key=ENV["OPENAI_API_KEY"],
    base_url=ENV["OPENAI_BASE_URL_OVERRIDE"]
)
response = create_chat(
    provider,
    "gpt-5-mini",
    [Dict("role" => "user", "content" => "Write some ancient Greek poetry")]
)

For more use cases, see the package tests.
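
Beyond chat, the same pattern applies to the other endpoint wrappers. Here is a hedged sketch using create_embeddings; the model name and the response fields follow OpenAI's embeddings API and are assumptions that may need updating for your account or provider:

using OpenAI

# Hedged sketch: the embedding model name below is a placeholder; pick one
# that your account or provider actually offers.
emb = create_embeddings(
    ENV["OPENAI_API_KEY"],
    "The quick brown fox jumps over the lazy dog",
    "text-embedding-3-small"
)
vector = emb.response[:data][begin][:embedding]  # the embedding vector itself
println(length(vector))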

Streaming with StreamCallbacks

OpenAI.jl integrates StreamCallbacks.jl for streaming responses.

1. Stream to any IO

create_chat(secret_key, model, messages; streamcallback=stdout)
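
Because any writable IO can serve as the sink, you can also collect the streamed text in memory. A minimal sketch, assuming secret_key, model, and messages are defined as in the Quick Start:

# Minimal sketch: stream into an IOBuffer, then read the collected text back.
buf = IOBuffer()
create_chat(secret_key, model, messages; streamcallback = buf)
streamed_text = String(take!(buf))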

2. Capture stream chunks

using OpenAI
cb = StreamCallback()
create_chat(secret_key, model, messages; streamcallback=cb)
cb.chunks  # the stream chunks received during the call

3. Customize printing

using OpenAI
import StreamCallbacks: print_content

function print_content(io::IO, content; kwargs...)
    printstyled(io, "🌊 $content"; color=:cyan)
end

cb = StreamCallback()
create_chat(secret_key, model, messages; streamcallback=cb)

To fully customize processing, you can overload StreamCallbacks.callback:

using OpenAI
import StreamCallbacks: callback, AbstractStreamCallback, AbstractStreamChunk, extract_content, print_content

@inline function callback(cb::AbstractStreamCallback, chunk::AbstractStreamChunk; kwargs...)
    processed_text = extract_content(cb.flavor, chunk; kwargs...)
    isnothing(processed_text) && return nothing
    print_content(cb.out, processed_text; kwargs...)
    return nothing
end

See examples/streamcallbacks.jl for a full walkthrough.

Feature requests

Feel free to open a PR, or file an issue if that's out of reach!

Notes