Description
Error Details

- Model: Qwen 2.5 Coder 32B (Tunnel)
- Provider: ollama
- Status Code: N/A

Error Output

```
request to http://127.0.0.1:11435/api/chat failed, reason: connect EADDRNOTAVAIL 127.0.0.1:11435
```
Continue in Cursor on Windows fails to connect to local Ollama over SSH tunnel (EADDRNOTAVAIL 127.0.0.1:11435)
I’m using Continue inside Cursor on Windows with a local Ollama endpoint exposed through an SSH tunnel.
Continue fails with:

```
request to http://127.0.0.1:11435/api/chat failed, reason: connect EADDRNOTAVAIL 127.0.0.1:11435
```
What works
- Ollama works on the remote Linux host
- The SSH tunnel is active
- `http://127.0.0.1:11435/api/tags` works from PowerShell
- The same endpoint also works from Node on the same Windows machine
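The Node check that succeeds outside the extension is roughly the following sketch (the `listModels` helper name and the 5-second timeout are mine, added for illustration):

```javascript
// Sketch of the out-of-extension Node check that succeeds on the same
// Windows machine (assumes the SSH tunnel forwards local port 11435).
const apiBase = "http://127.0.0.1:11435";

async function listModels() {
  // Same endpoint Continue hits for model discovery.
  const res = await fetch(`${apiBase}/api/tags`, {
    signal: AbortSignal.timeout(5000), // fail fast if the tunnel is down
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json(); // Ollama returns { models: [...] }
}

listModels()
  .then((body) => console.log(body.models.map((m) => m.name)))
  .catch((err) => console.error("request failed:", err.message));
```

Run with plain `node` (v18+ for global `fetch`), this prints the model list, including `qwen2.5-coder:32b`, with no connection error.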
Continue config
```yaml
name: Local Config
version: 1.0.0
schema: v1
models:
  - name: Tunnel Ollama
    provider: ollama
    model: qwen2.5-coder:32b
    apiBase: http://127.0.0.1:11435
    requestOptions:
      noProxy:
        - localhost
        - 127.0.0.1
      timeout: 30000
```
Expected
Continue should connect to the local Ollama API through the SSH tunnel and return a response.
Actual
Continue in Cursor throws EADDRNOTAVAIL, even though the same localhost endpoint is reachable outside the extension.
Question
Is this a known issue with:

- Continue inside Cursor vs. VS Code
- Windows localhost handling
- proxy handling for local Ollama endpoints