
Conversation

@zhao-oai (Collaborator) commented Oct 27, 2025

fixes: #5775

Previously, model slugs were being mismatched due to formatting differences, so important metadata (such as the context window) was missing. For example, when running Codex with gpt-oss-20b on Ollama, the model slug is gpt-oss:20b, while Codex expected the slug gpt-oss-20b.

This PR fixes the issue by normalizing model slugs. While this is an imperfect fix, it should allow most models on different providers to work correctly with Codex.
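
For context, a minimal sketch of the normalization idea (hypothetical names, not the actual diff in this PR): strip provider-specific separators, such as Ollama's ':' tag separator, so that a slug like gpt-oss:20b resolves to the same metadata entry as gpt-oss-20b.

```rust
// Hypothetical sketch (not the PR's code): normalize provider-specific slugs
// before looking up model metadata such as the context window.
fn normalize_model_slug(slug: &str) -> String {
    // Ollama uses ':' to separate the model name from its tag (gpt-oss:20b),
    // while the metadata table keys use '-' (gpt-oss-20b). Lowercasing is an
    // extra assumption here, to make the lookup case-insensitive.
    slug.to_ascii_lowercase().replace(':', "-")
}

fn main() {
    assert_eq!(normalize_model_slug("gpt-oss:20b"), "gpt-oss-20b");
    assert_eq!(normalize_model_slug("GPT-OSS-20B"), "gpt-oss-20b");
}
```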

@zhao-oai changed the title from "normalizing model slug in get_model_info" to "fix: normalizing model slug in get_model_info" on Oct 27, 2025
@zhao-oai (Collaborator, Author)

@codex review

@chatgpt-codex-connector (Contributor)

Codex Review: Didn't find any major issues. Keep it up!

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

@zhao-oai (Collaborator, Author)

@codex review

1 similar comment

@chatgpt-codex-connector (Contributor)

Codex Review: Didn't find any major issues. Can't wait for the next one!




Development

Successfully merging this pull request may close these issues.

stuck at "100% context left" for local models
