feat(ai): Honor local-only AI policy setting and fix model persistence #1971
Summary
Warning
I'm not a very talented programmer.
The majority of this PR was authored by Claude Code.
I've reviewed this code to the best of my human ability.
This change has been working for me for about a week.
Changes
- Added a `filteredModelList` property that filters to localhost endpoints when the policy requires local-only
- Made `currentModelId` reactive to `Persistent.states` changes, validating against the filtered list
- Changed `setModel()` to validate against the filtered model list instead of the full list (sketched below)
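Roughly, the logic looks like this (a minimal QML sketch, not the actual diff — the policy flag name, the model object shape, and the `Persistent.states.ai.model` path are assumptions):

```qml
import QtQuick

QtObject {
    id: root

    // Full model list as loaded at startup; each entry is assumed to have
    // `id` and `endpoint` fields (hypothetical shape for this sketch).
    property var modelList: []

    // Assumed policy flag; the real setting name in the repo may differ.
    property bool localOnlyPolicy: false

    // Only expose models served from localhost when the policy requires it.
    readonly property var filteredModelList: localOnlyPolicy
        ? modelList.filter(m => m.endpoint.indexOf("//localhost") !== -1
                                || m.endpoint.indexOf("//127.0.0.1") !== -1)
        : modelList

    // Reactive: re-evaluates when Persistent.states (or the filter) changes,
    // falling back to the first allowed model if the saved one is filtered out.
    // `Persistent` is the shell's settings singleton; the `.ai.model` path is assumed.
    readonly property string currentModelId: {
        const saved = Persistent.states.ai.model
        const ok = filteredModelList.some(m => m.id === saved)
        return ok ? saved
                  : (filteredModelList.length > 0 ? filteredModelList[0].id : "")
    }

    // Persist a selection only if it is in the filtered list.
    function setModel(modelId) {
        if (filteredModelList.some(m => m.id === modelId))
            Persistent.states.ai.model = modelId
    }
}
```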
Test plan
- Note: the fallback doesn't take effect until the shell is restarted, since the Ollama model list is cached at startup; see issue #1803 (Installed Ollama models don't load in Quickshell until manual reload)
🤖 Generated with Claude Code