Supporting OpenAI-API-compatible endpoints such as the `/v1/chat/completions` and `/v1/completions` APIs would have the following benefits:

* Allow Jetstream to be used as a drop-in replacement for the vLLM server
* Make it easier to demo Jetstream with OSS chat UIs (e.g. https://github.com/open-webui/open-webui, https://github.com/danny-avila/LibreChat)
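To illustrate the drop-in compatibility this would enable, here is a minimal sketch of a client hitting an OpenAI-compatible `/v1/chat/completions` endpoint using only the Python standard library. The server address and model name are placeholders, not part of any existing Jetstream deployment; any OpenAI-compatible serving stack (vLLM, or Jetstream once this is supported) could answer the same request shape.

```python
import json
from urllib import request

# Hypothetical server address; adjust to wherever the server is deployed.
BASE_URL = "http://localhost:8000"


def build_chat_request(model: str, user_message: str) -> bytes:
    """Build an OpenAI-compatible /v1/chat/completions request body."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return json.dumps(payload).encode("utf-8")


def chat(model: str, user_message: str) -> str:
    """POST the request and return the first choice's message content."""
    req = request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=build_chat_request(model, user_message),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    # Response shape follows the OpenAI chat completions schema.
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Show the request body a compatible server would receive.
    print(build_chat_request("my-model", "Hello!").decode("utf-8"))
```

Because the request and response schemas are the same ones the official OpenAI clients and OSS chat UIs speak, no client-side changes would be needed to point them at a compatible Jetstream server.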