Add Azure OpenAI provider #279
base: main
Conversation
Hola @crmne. Before I go further, can you please take a look and let me know if I'm on the right track? I had to overload some methods to "steal" the model ID and config properties because of the way Azure OpenAI API URLs are constructed from the model and API version.
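For illustration, the URL shape being described can be sketched as a small helper. This is only a sketch of the Azure URL construction; the helper name and parameters are illustrative, not the PR's actual code, which overloads provider methods instead:

```ruby
# Sketch: how an Azure OpenAI chat-completions URL is assembled from the
# endpoint, deployment (model) name, and API version. The function name and
# keyword arguments here are hypothetical, for illustration only.
def azure_chat_completions_url(endpoint:, deployment:, api_version:)
  "#{endpoint}/openai/deployments/#{deployment}/chat/completions" \
    "?api-version=#{api_version}"
end

puts azure_chat_completions_url(
  endpoint: 'https://example.openai.azure.com',
  deployment: 'gpt-4o',
  api_version: '2024-12-01-preview'
)
```

Because the deployment name sits in the path and the API version in the query string, the provider needs the model ID and config available at URL-building time, which is what motivates the method overloads.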
@crmne, apologies if I am pestering you. I've been using the new provider with my project, where I have tested it. I don't fully understand how / what I should do for the RSpec tests. Let me know how you would like me to proceed, or if you are comfortable with accepting this PR as is.
Hi @oxaroky02, just add your models to the correct lists in
@crmne I've configured
Hmm, looks like I need to figure out the Postgres config for the commit hooks to work. OK, no worries, I'll work on that.
@oxaroky02 what errors are you getting with PG? I may be able to help. This PR is necessary for us to utilize RubyLLM without patching.
@oxaroky02 also, could we add a config for always assuming the model exists? With Azure we have many model names and would love to just use the standard
I tried running
@oxaroky02 do you have a Mac? If so, the simplest way to get PG running is installing https://postgresapp.com/. It'll give you what you need, and then you can remove it if you don't need it anymore.
@t2, I have
I'm getting the following exception:
I see the spec helper is loading a
Questions:
I appreciate your patience and support with this.
I'm not extremely familiar with Docker, so I don't think I could help there. If you just install Postgres.app on your main computer, I expect it should just work.
@t2, no worries, I'll keep looking. I'm limited in what I can install on my work laptop. The DB is up and running and I'm able to connect to it, etc. So now I'm trying to figure out where exactly the password can be specified for the tests to run. Were you able to try the new provider, though? Did it work for you? I'll look into your other question soon; I'm still figuring out the innards of
Looks like I can set the default for
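For anyone hitting the same Postgres authentication issue, one common approach (an assumption about this repo's test setup, not something verified against its spec helper) is to hand the specs a full connection URL through the environment so the password is picked up:

```shell
# Hypothetical example: a standard Postgres connection URL with an explicit
# password. Adjust user, password, host, and database name to your local setup.
export DATABASE_URL="postgres://postgres:yourpassword@localhost:5432/ruby_llm_test"
```

Most Ruby database layers (including ActiveRecord) prefer `DATABASE_URL` over file-based config when it is set, which makes it a convenient override for local hook runs.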
I just checked out your branch and tested, and IT WORKED 🤘🏼! Once you get past the tests, this will be great for the folks on Azure OpenAI.
DELETING (I figured out the problem. There was something going on with the default
Force-pushed from 07ffcf8 to 9b98721
The tests are running properly now. I have added models listing support, and I've updated
@crmne, when I run the rake task it ends up making a lot of changes to the
One thing I can do (which I tried for fun) is take the new entry and stuff it into the original
BUT this kinda goes against the constraint that I shouldn't modify the file manually. How would you like me to proceed?
@crmne given that adding this provider only adds Azure-specific models, is there a way to skip models:update for any providers that aren't impacted? From what I am gathering, we must have valid ENV variables for all providers to refresh the models list, and without those overcommit will fail to allow commits.
@oxaroky02 are you planning to finish this?
Hola @undersky0, sorry for the delay. Last time I worked on this I was stuck (earlier question quoted below), and then my life got super busy. 😄 I can try to finish this, or I can hand it over to someone else who has more time in the short term. Let me at least see if I can get the fork rebased; I see a lot of commits.
Force-pushed from d0a2490 to f651c86
@crmne or anyone else who knows 😄 I have rebased and fixed my old code to match all the changes (wow!) since I last looked at this. None of the issues I had before exist now, and I was able to move some hooks and drop some files, etc. 👍 All the tests that pertain to Azure OpenAI are passing with the exception of four (error response-related), which I have marked for skipping (for now). However, there are three failing tests, as follows:
All three are failing for the same reason, so here's one of them. (Click to reveal.)
What do I need to do for these? Any help is appreciated. Once I resolve these, I have a working PR again to commit.
@oxaroky02 Do you have Claude Code or Codex? I just had Claude Code fix the errors, but I don't have access to the repo to push a branch. The broken tests are referencing Ollama — is that by design?
```ruby
module Providers
  # Azure OpenAI API integration. Derived from OpenAI integration to support
  # OpenAI capabilities via Microsoft Azure endpoints.
  module AzureOpenAI
```
What do you think about being consistent with other existing providers, like deepseek does:

```ruby
module RubyLLM
  module Providers
    # Azure OpenAI API integration. Derived from OpenAI integration to support
    # OpenAI capabilities via Microsoft Azure endpoints.
    class AzureOpenAI < OpenAI
      include AzureOpenAI::Chat
      include AzureOpenAI::Streaming
      include AzureOpenAI::Models

      def api_base
        # https://<ENDPOINT>/openai/deployments/<MODEL>/chat/completions?api-version=<APIVERSION>
        "#{@config.azure_openai_api_base}/openai"
      end
    end
  end
end
```

Great work by the way!
I made this implementation locally, and all the tests are green. Let me know what's the best way to contribute to your PR and I'll share these changes.
Hola @alexperto, thanks. I have made the changes to reflect the use of `class` vs `module`. The problem I'm facing right now is that I can't commit them, as the pre-commit hook fails. I suppose I could commit without the pre-commit checking, knowing the PR will fail its tests ... but I don't want to be rude. 😄 Essentially, the code everyone is seeing is out of date.
@t2, first I should point out that the errors I'm seeing come after a number of changes I've made which I am unable to commit. If there's a way to commit knowing the PR will fail its checks, let me know, as that might make it easier for you and others to reproduce the issue. Essentially, the code everyone is seeing is out of date. One of the tests refers to Ollama and local models; however, all three have one thing in common:
Also, all three are failing because of the following error:
I have to admit I don't know how VCR works in this case; I assume some API call is being made, but I can't figure out where or what.
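For context on how VCR-based specs usually behave: VCR intercepts HTTP requests and replays previously recorded "cassettes"; a request with no matching recording typically raises an unhandled-request error like the one above. A generic configuration looks roughly like this — a sketch of typical VCR usage, not this repo's actual spec setup:

```ruby
require 'vcr'

VCR.configure do |c|
  c.cassette_library_dir = 'spec/fixtures/vcr_cassettes' # where recordings live
  c.hook_into :webmock                                   # intercept HTTP via WebMock
  c.default_cassette_options = { record: :once }         # record on first run, replay after
  # Keep secrets out of recorded cassettes
  c.filter_sensitive_data('<AZURE_OPENAI_API_KEY>') { ENV['AZURE_OPENAI_API_KEY'] }
end
```

So when a provider builds a URL that differs from what was recorded (e.g. an Azure deployment path instead of the plain OpenAI path), VCR finds no matching cassette interaction and the spec fails, which would explain an "unhandled HTTP request" style error without any real API call being visible in the test code.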
First I want to apologize, as I disabled the pre-commit hook and pushed my current changes into my branch for this PR. 😞 Sorry. I hope this will help reduce all the time you're all kindly wasting looking at my old code. Sorry again. As of now, all tests except 3 are passing. Here's the summary; use the link below to expand and see the detailed failure exception. The tests are all related to
Click to expand and see the detailed test failures.
Also in need of Azure support. Let me know if you want another tester for the new provider.
@eichert12 Thanks, and yes. Anyone who can test this would be helpful. My main issue, though, is that I don't know what to do about those test failures above. 😢 I am happy to invite someone who can help to my forked repo, or for someone with access to modify the PR branch. I am stuck. Sorry this has been blocked for so long.
@oxaroky02 thanks for all of your hard work on this. I am not able to help on the VCR tests, but I was able to get Azure going with the standard install by overriding a few items in my initializer. So, until this gets all fixed up, if you want to use something similar, here it is:

```ruby
# config/initializers/ruby_llm.rb
# frozen_string_literal: true

module RubyLLM
  class << self
    def chat_context
      @chat_context ||= context do |config|
        config.openai_api_key = api_key
        config.openai_api_base = chat_url
        config.request_timeout = 180
      end
    end

    def embedding_context
      @embedding_context ||= context do |config|
        config.openai_api_key = api_key
        config.openai_api_base = embedding_url
        config.request_timeout = 180
      end
    end

    def api_key
      dummy_env? ? '' : ENV.fetch('AZURE_OPENAI_API_KEY')
    end

    def chat_url
      return '' if dummy_env?

      "#{ENV.fetch('AZURE_OPENAI_BASE_URL')}/openai/deployments/" \
        "#{ENV.fetch('OPENAI_MODEL', 'gpt-4o')}" \
        "?api-version=#{ENV.fetch('AZURE_OPENAI_API_VERSION', '2024-12-01-preview')}"
    end

    def embedding_url
      return '' if dummy_env?

      "#{ENV.fetch('AZURE_OPENAI_BASE_URL')}/openai/deployments/" \
        "#{ENV.fetch('OPENAI_EMBEDDING_MODEL', 'text-embedding-3-small')}" \
        "?api-version=#{ENV.fetch('AZURE_OPENAI_API_VERSION', '2024-12-01-preview')}"
    end

    def dummy_env?
      ENV['SECRET_KEY_BASE_DUMMY'].present?
    end

    def reset!
      @chat_context = nil
      @embedding_context = nil
    end
  end
end

module RubyLLM
  module ContextDefaults
    def chat(*, **opts)
      opts[:provider] ||= :openai
      opts[:context] ||= RubyLLM.chat_context
      opts[:assume_model_exists] = true unless opts.key?(:assume_model_exists)
      super(*, **opts)
    end
  end
end
RubyLLM.singleton_class.prepend(RubyLLM::ContextDefaults)

module RubyLLM
  module EmbeddingDefaults
    def embed(*, **opts)
      opts[:provider] ||= :openai
      opts[:context] ||= RubyLLM.embedding_context
      opts[:assume_model_exists] = true unless opts.key?(:assume_model_exists)
      super(*, **opts)
    end
  end
end
RubyLLM::Embedding.singleton_class.prepend(RubyLLM::EmbeddingDefaults)

RubyLLM.configure do |config|
  config.openai_api_key = RubyLLM.api_key
  config.openai_api_base = RubyLLM.chat_url
  config.request_timeout = 180
end
```

```shell
AZURE_OPENAI_BASE_URL="https://YOUR_ENDPOINT.openai.azure.com"
```

And now you can use it just like the docs say:

```ruby
chat = RubyLLM.chat
response = chat.ask "What is Ruby?"

embedding = RubyLLM.embed "What is Ruby?"
puts embedding.vectors
```
What this does
Adds Azure OpenAI provider, derived from the existing OpenAI provider.
Provider can be configured as follows:
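The original configuration snippet did not survive extraction. Based on the `@config.azure_openai_api_base` usage discussed in the review comments, it presumably looks roughly like this; the exact config key names are assumptions, not confirmed against the PR:

```ruby
RubyLLM.configure do |config|
  # Assumed keys mirroring azure_openai_api_base from the review discussion;
  # the final PR may name or structure these differently.
  config.azure_openai_api_key = ENV['AZURE_OPENAI_API_KEY']
  config.azure_openai_api_base = 'https://YOUR_ENDPOINT.openai.azure.com'
  config.azure_openai_api_version = '2024-12-01-preview'
end
```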
Type of change
Scope check
Quality check
- `overcommit --install` and all hooks pass
- (`models.json`, `aliases.json`)
API changes
Related issues