Replies: 2 comments 2 replies
- Just to cross-check: does this happen with other models like GPT-5 or Sonnet-4? Are you able to replicate it?
- Well, after some testing, it seems there is a problem with gpt-oss-120b's tool handling. gpt-oss-20b seems to have the same problem.
Hi,
recently I set up a local headless Ubuntu server running llama.cpp's llama-server so that I can use it across my whole local network.
Everything works fine so far. However, when I use Continue in VS Code and try to work on my project files, Continue always tries to open empty file paths, which of course fails.
This is the config.yaml in my project workspace:
```yaml
name: Local Assistant
version: 1.0.0
schema: v1
models:
  - provider: openai
    model: gpt-oss-120b
    apiBase: http://192.168.1.8:5000/v1
    roles:
    parameters:
      temperature: 1.0
      top_p: 1.0
      top_k: 0
      max_context_length: 131072
context:
```
This is the error I get:
```
read_file failed with the message: `filepath` argument is required and must not be empty or whitespace-only. (type string)

Please try something else or request further instructions.
```
Continue doesn't stop after one of these failures; it keeps calling the tool with an empty path over and over in an endless loop.
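One way to narrow this down is to request a `read_file` call directly from llama-server's OpenAI-compatible endpoint, bypassing Continue. The sketch below is only a minimal test: the host, port, and model name are taken from the config above, the `read_file` schema is guessed from the error message (not Continue's actual tool definition), and the prompt and filename are made up.

```python
# Minimal sketch: ask the model for a read_file tool call directly against
# llama-server, to see whether it fills in the filepath argument at all.
from openai import OpenAI

# Endpoint and model name assumed from the config.yaml above.
client = OpenAI(base_url="http://192.168.1.8:5000/v1", api_key="not-needed")

# Hypothetical tool schema, reconstructed from the error message only.
tools = [{
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Read a file from the workspace.",
        "parameters": {
            "type": "object",
            "properties": {
                "filepath": {"type": "string", "description": "Path of the file to read."},
            },
            "required": ["filepath"],
        },
    },
}]

resp = client.chat.completions.create(
    model="gpt-oss-120b",
    messages=[{"role": "user", "content": "Open main.py and summarize it."}],
    tools=tools,
)

# Print whatever tool calls the model produced and their raw arguments.
for call in resp.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```

If the `arguments` come back empty or without a `filepath` here as well, the problem would be on the model/server side rather than in Continue, which would match the observation about gpt-oss tool handling in the replies above.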