Fix: Handle missing prompt_tokens_details in OpenAI-compatible APIs #83

Merged
merged 3 commits into from
Mar 26, 2025

Conversation

rishsriv
Member

Summary

  • Add checks for the existence of prompt_tokens_details in OpenAI response objects before accessing its properties
  • Fix a TypeError when using third-party OpenAI-compatible APIs that don't include cache-related information

Test plan

  • The fix adds a null check before accessing the cached_tokens property
  • When prompt_tokens_details is missing, all tokens are treated as uncached

Fixes #82
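A minimal sketch of the guard this PR describes. The helper name and the shape of the usage object are illustrative, not the project's actual code; the point is that both prompt_tokens_details and its cached_tokens field may be absent on third-party OpenAI-compatible backends:

```python
def count_cached_tokens(usage):
    """Return the cached-token count from a usage object.

    Treats all tokens as uncached when prompt_tokens_details is
    missing or None, as happens with some third-party
    OpenAI-compatible APIs. (Hypothetical helper for illustration.)
    """
    # getattr guards against the attribute being absent entirely;
    # the `or 0` guards against an explicit None value.
    details = getattr(usage, "prompt_tokens_details", None)
    if details is None:
        return 0
    return getattr(details, "cached_tokens", 0) or 0
```

With this guard in place, a response lacking prompt_tokens_details simply yields a cached-token count of zero instead of raising a TypeError.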

🤖 Generated with Claude Code

rishsriv and others added 3 commits March 26, 2025 17:55
Add checks for the existence of prompt_tokens_details in response objects
before attempting to access it. This fixes an issue when using third-party
OpenAI-compatible APIs that don't include cache-related information.

Fixes #82.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <[email protected]>
@rishsriv rishsriv merged commit 910931a into main Mar 26, 2025
2 checks passed
@rishsriv rishsriv deleted the fix-third-party-openai-api branch March 26, 2025 10:22
Successfully merging this pull request may close these issues.

  • Add a check for the existence of prompt_tokens_details