Feature Request: Ollama Cloud API Support #3385
piknockyou started this conversation in Feature requests
Summary: Native support for Ollama Cloud remote models via API key (no local proxy needed).
Problem: The Ollama provider is limited to local/self-hosted instances. Ollama Cloud (Sep 2025) enables large remote models (e.g., gpt-oss:120b-cloud) via https://ollama.com/api + API key, but this is not currently configurable.
Benefits: Easy access to 100B+ parameter models without local hardware.
Context: Ollama Docs
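To illustrate the shape of the request such support would need to issue, here is a minimal sketch. The endpoint path, bearer-token header scheme, and the `build_chat_request` helper are assumptions for illustration, not a confirmed Ollama Cloud contract; check the Ollama docs linked above for the authoritative API.

```python
import json

# Assumed cloud endpoint; adjust per the official Ollama Cloud docs.
OLLAMA_CLOUD_URL = "https://ollama.com/api/chat"


def build_chat_request(api_key: str, model: str, prompt: str) -> tuple[dict, bytes]:
    """Assemble headers and a JSON body for a cloud-hosted model call.

    Illustrative helper: shows that only an API key (no local proxy)
    would be needed to target a remote model.
    """
    headers = {
        "Authorization": f"Bearer {api_key}",  # API key replaces the local proxy
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,  # e.g. "gpt-oss:120b-cloud"
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }).encode("utf-8")
    return headers, body


headers, body = build_chat_request("sk-example", "gpt-oss:120b-cloud", "Hello")
```

The request could then be sent with any HTTP client; the point is that a single base URL plus key field in the provider settings would cover the whole flow.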