
openai_proxy

Minimal Ruby OpenAI proxy focused on the core pieces required to replace LiteLLM for OpenAI-only routing.

Supported use case: transparent OpenAI proxying with MySQL-backed project and token storage, Redis for hot-path caching and async usage queues, off-path logging, and expiring tokens.

  • OpenAI-compatible transparent proxy on /v1/*
  • Project management with one OpenAI API key per project
  • Short-lived token minting on /tokens/mint
  • Token validation backed by MySQL with Redis hot-path caching
  • Redis-backed HTTP response caching for cacheable OpenAI requests
  • Usage logging shipped off the request path through Redis to CloudWatch Logs
  • No admin UI and no benchmark suite
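
The off-path usage logging above can be sketched as serializing a JSON event and pushing it onto the Redis queue for the async shipper. A minimal sketch; the event field names below are illustrative assumptions, not the service's actual schema:

```ruby
require "json"
require "time"

# Build a usage event as it might be queued for async shipping to
# CloudWatch Logs. Field names here are illustrative assumptions.
def build_usage_event(project_id:, model:, prompt_tokens:, completion_tokens:)
  {
    project_id: project_id,
    model: model,
    prompt_tokens: prompt_tokens,
    completion_tokens: completion_tokens,
    total_tokens: prompt_tokens + completion_tokens,
    recorded_at: Time.now.utc.iso8601
  }.to_json
end

# In the real service this payload would be pushed onto the Redis list
# named by OPENAI_PROXY_USAGE_QUEUE_KEY, e.g.:
#   redis.lpush("openai_proxy:usage_events", event)
event = build_usage_event(project_id: "p-123", model: "gpt-4o-mini",
                          prompt_tokens: 12, completion_tokens: 30)
```

Because the event is serialized before it hits Redis, the request path only pays for one `LPUSH`; the CloudWatch shipper drains the list off-path.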

Scope

This is intentionally a minimum viable service:

  • Each project stores exactly one upstream OpenAI API key
  • Project API keys are encrypted at rest with AES-256-GCM
  • Token records carry project_id and free-form metadata for routing context and observability
  • CloudWatch shipping uses structured log events compatible with the current Go deployment shape
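
Encrypting a stored upstream key with AES-256-GCM can be sketched with Ruby's stdlib OpenSSL, assuming OPENAI_PROXY_ENCRYPTION_KEY holds a base64-encoded 32-byte key. The iv|ciphertext|tag packing below is an illustrative layout, not necessarily the service's storage format:

```ruby
require "openssl"
require "base64"
require "securerandom"

# Encrypt a plaintext API key with AES-256-GCM and pack the result as
# base64(iv + ciphertext + tag). Layout is an illustrative assumption.
def encrypt_api_key(plaintext, base64_key)
  key = Base64.strict_decode64(base64_key)
  cipher = OpenSSL::Cipher.new("aes-256-gcm").encrypt
  cipher.key = key
  iv = cipher.random_iv               # 12-byte IV for GCM
  cipher.auth_data = ""
  ciphertext = cipher.update(plaintext) + cipher.final
  Base64.strict_encode64(iv + ciphertext + cipher.auth_tag)
end

def decrypt_api_key(encoded, base64_key)
  key = Base64.strict_decode64(base64_key)
  raw = Base64.strict_decode64(encoded)
  iv, ciphertext, tag = raw[0, 12], raw[12..-17], raw[-16..]
  cipher = OpenSSL::Cipher.new("aes-256-gcm").decrypt
  cipher.key = key
  cipher.iv = iv
  cipher.auth_tag = tag               # verified on #final
  cipher.auth_data = ""
  cipher.update(ciphertext) + cipher.final
end
```

GCM authenticates as well as encrypts, so a tampered ciphertext or wrong key raises on `#final` instead of returning garbage.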

Not included:

  • Admin UI
  • Full management API beyond minimal project creation and listing
  • Benchmark harness

Endpoints

  • GET /projects
  • POST /projects
  • GET /health
  • POST /tokens/mint
  • ANY /v1/*
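
Transparent proxying on /v1/* amounts to rewriting the incoming request path onto the upstream base URL. A minimal sketch under that assumption; the helper name is illustrative:

```ruby
# Map an incoming /v1/* request path onto the upstream base URL,
# honoring the OPENAI_PROXY_UPSTREAM_BASE_URL default from this README.
# The helper name is an illustrative assumption.
DEFAULT_UPSTREAM = "https://api.openai.com"

def upstream_url(request_path,
                 base_url = ENV.fetch("OPENAI_PROXY_UPSTREAM_BASE_URL", DEFAULT_UPSTREAM))
  raise ArgumentError, "only /v1/* paths are proxied" unless request_path.start_with?("/v1/")
  base_url.chomp("/") + request_path
end
```

Non-/v1 paths (such as /projects or /health) are handled by the proxy itself rather than forwarded.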

Environment

  • OPENAI_PROXY_UPSTREAM_BASE_URL default: https://api.openai.com
  • OPENAI_PROXY_MANAGEMENT_TOKEN required for /projects and /tokens/mint
  • OPENAI_PROXY_DATABASE_URL required, MySQL URL for Sequel
  • OPENAI_PROXY_ENCRYPTION_KEY required, base64-encoded 32-byte key for encrypting stored upstream API keys
  • OPENAI_PROXY_REDIS_URL required
  • OPENAI_PROXY_CLOUDWATCH_LOG_GROUP optional
  • OPENAI_PROXY_CLOUDWATCH_LOG_STREAM optional
  • OPENAI_PROXY_CLOUDWATCH_REGION optional
  • OPENAI_PROXY_USAGE_QUEUE_KEY default: openai_proxy:usage_events
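
Loading these variables might look like the following sketch, applying the defaults listed above and failing fast on missing required values; the `Config` struct is illustrative, not the service's actual code:

```ruby
# Illustrative config loader for the environment variables documented
# above. Required variables raise KeyError when absent; optional ones
# fall back to the README's documented defaults.
Config = Struct.new(:upstream_base_url, :management_token, :database_url,
                    :encryption_key, :redis_url, :usage_queue_key) do
  def self.from_env(env = ENV)
    new(
      env.fetch("OPENAI_PROXY_UPSTREAM_BASE_URL", "https://api.openai.com"),
      env.fetch("OPENAI_PROXY_MANAGEMENT_TOKEN"),   # required
      env.fetch("OPENAI_PROXY_DATABASE_URL"),       # required
      env.fetch("OPENAI_PROXY_ENCRYPTION_KEY"),     # required
      env.fetch("OPENAI_PROXY_REDIS_URL"),          # required
      env.fetch("OPENAI_PROXY_USAGE_QUEUE_KEY", "openai_proxy:usage_events")
    )
  end
end
```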

Project Layout

  • lib/ runtime code
  • spec/ RSpec coverage
  • docs/ standalone project documentation
  • .github/ repo-local Copilot and extraction-ready CI templates

Run

```shell
bundle install
bundle exec puma -C config/puma.rb
```

Or:

```shell
make run
```

Docker

Build the image locally from the openai-proxy directory:

```shell
docker build -t openai-proxy:dev .
docker run --rm -p 8080:8080 --env-file .env openai-proxy:dev
```

Bring up the full local stack with MySQL and Redis:

```shell
docker compose up --build
```

GitHub Actions workflows for CI and image publishing live under .github/workflows/.

Default image target:

ghcr.io/sofatutor/openai-proxy

Schema setup

Apply db/schema.sql to the target MySQL database before boot.

Validation

```shell
export RBENV_VERSION=3.3.9
bundle exec rspec
make lint
```

Project create request

```json
{
  "name": "search",
  "api_key": "sk-live-project-key"
}
```
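
For illustration, this request could be assembled with Ruby's Net::HTTP as below (built but not sent); the localhost:8080 address and the Bearer scheme for OPENAI_PROXY_MANAGEMENT_TOKEN are assumptions:

```ruby
require "net/http"
require "json"
require "uri"

# Build (without sending) a POST /projects request. The proxy address
# and Bearer auth scheme are illustrative assumptions.
uri = URI("http://localhost:8080/projects")
req = Net::HTTP::Post.new(uri)
req["Authorization"] = "Bearer #{ENV.fetch('OPENAI_PROXY_MANAGEMENT_TOKEN', 'change-me')}"
req["Content-Type"] = "application/json"
req.body = { name: "search", api_key: "sk-live-project-key" }.to_json

# To actually send it:
#   Net::HTTP.start(uri.host, uri.port) { |http| http.request(req) }
```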

Token mint request

```json
{
  "project_id": "<project-uuid>",
  "ttl_seconds": 3600,
  "metadata": {
    "feature": "search",
    "user_id": "123"
  }
}
```
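
Conceptually, minting derives an expiry from `ttl_seconds` and attaches the free-form metadata to the token record. A sketch; the token format and field names are assumptions, not the service's actual response shape:

```ruby
require "securerandom"
require "time"

# Sketch of what minting a short-lived token might compute: a random
# token value plus an expiry derived from ttl_seconds. The "opx-"
# prefix and field names are illustrative assumptions.
def mint_token(project_id:, ttl_seconds:, metadata: {}, now: Time.now.utc)
  {
    token: "opx-#{SecureRandom.hex(16)}",
    project_id: project_id,
    metadata: metadata,
    expires_at: (now + ttl_seconds).iso8601
  }
end
```

A validator can then reject any token whose `expires_at` is in the past, with the Redis hot-path cache keyed on the token value.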

Additional docs

See docs/ for standalone project documentation.
