Add Anthropic Models to Cache Prompt (All-Hands-AI#3775)
* Add Anthropic Models to Cache Prompt

* Update Cache Prompt Active Check for Partial String Matching
ColeMurray authored Sep 8, 2024
1 parent ab38515 commit dadada1
Showing 1 changed file with 2 additions and 3 deletions.
5 changes: 2 additions & 3 deletions openhands/llm/llm.py
@@ -478,9 +478,8 @@ def is_caching_prompt_active(self) -> bool:
         Returns:
             boolean: True if prompt caching is active for the given model.
         """
-        return (
-            self.config.caching_prompt is True
-            and self.config.model in cache_prompting_supported_models
+        return self.config.caching_prompt is True and any(
+            model in self.config.model for model in cache_prompting_supported_models
         )
 
     def _post_completion(self, response) -> None:
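For context, here is a minimal, self-contained sketch of the check after this change. The function signature is simplified from the method on the LLM class, and the model names in the list are illustrative placeholders rather than the actual contents of cache_prompting_supported_models in OpenHands.

```python
# Illustrative stand-in for cache_prompting_supported_models; the real list
# lives in openhands/llm/llm.py and may contain different entries.
cache_prompting_supported_models = [
    'claude-3-5-sonnet-20240620',
    'claude-3-haiku-20240307',
]


def is_caching_prompt_active(caching_prompt: bool, model: str) -> bool:
    """Return True when prompt caching is enabled and the configured model
    name contains one of the supported model names as a substring."""
    return caching_prompt is True and any(
        supported in model for supported in cache_prompting_supported_models
    )


# Substring matching lets provider-prefixed identifiers qualify, which the
# previous exact membership check (`model in cache_prompting_supported_models`)
# would have rejected.
print(is_caching_prompt_active(True, 'claude-3-5-sonnet-20240620'))            # True
print(is_caching_prompt_active(True, 'anthropic/claude-3-5-sonnet-20240620'))  # True
print(is_caching_prompt_active(False, 'anthropic/claude-3-5-sonnet-20240620')) # False
```

The design choice here is to trade exactness for flexibility: any configured model string that embeds a supported model name enables caching, rather than requiring the configured name to match a list entry verbatim.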
