Unable to connect to ollama server #10860
I have an Ollama server that is accessible on localhost:11434. I am able to see the supported models via: [...]

I set this up in VSCode / Continue as follows: [...]

I then try to chat with the model in VSCode, but immediately get the following error: [...]

As I understand it, Ollama is serving HTTP, but Continue seems to be trying to connect via HTTPS, despite the explicit 'http' in the config file. What am I doing wrong?
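For reference, a minimal Continue model entry pointing at a local Ollama server over plain HTTP might look like the sketch below (based on Continue's `config.json` format; the model name `llama3` is a placeholder — substitute whatever `curl http://localhost:11434/api/tags` reports on your machine):

```json
{
  "models": [
    {
      "title": "Ollama (local)",
      "provider": "ollama",
      "model": "llama3",
      "apiBase": "http://localhost:11434"
    }
  ]
}
```

Note the explicit `http://` scheme in `apiBase` — if the error still mentions HTTPS with a config like this, something between Continue and the server (e.g. a proxy) is likely rewriting the request.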
Tried to reproduce this on my machine and it seems to be working fine. Can you try changing your VSCode proxy (`http.proxy`) settings (or other related proxy settings), which might be interfering?
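As a sketch, the relevant settings live in VSCode's `settings.json`. Clearing the proxy URL and turning off proxy support for the extension host (one way to rule proxying out while debugging — not necessarily the right permanent setting for your network) might look like:

```json
{
  "http.proxy": "",
  "http.proxySupport": "off"
}
```

With `http.proxySupport` set to `"off"`, extensions such as Continue should connect to `localhost:11434` directly; if the error disappears, a proxy rule was intercepting the request.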