Replies: 6 comments
- Any suggestions?
- Maybe anyone knows the answer?
- Why don't these commands work? Output in the CLI: While this works on the same server:
- Are you looking for this?
- Not at all.
- ```
  git clone https://github.com/ggml-org/llama.cpp.git
  ```
  In the llama.cpp folder, after cmake (follow the instructions to build on your system):
  ```
  build/bin/llama-server --version
  llama-server --help
  ```
  Asking an API endpoint for information about the system where it is hosted looks like a vulnerability.
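If you want to script against `--version`, here is a minimal sketch. The sample output string below is an assumption; the exact format of `llama-server --version` output depends on the build, so adjust the pattern to what your binary actually prints.

```shell
# Hypothetical sample of `llama-server --version` output; treat this
# exact format as an assumption, not a guarantee.
sample='version: 4273 (abc1234)'

# Extract just the numeric build number with sed.
build=$(printf '%s\n' "$sample" | sed -n 's/^version: \([0-9][0-9]*\).*/\1/p')
echo "$build"
```

In a real script you would replace the `sample=` line with `sample=$(llama-server --version 2>&1)` and compare `$build` against whatever minimum you require.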
- For example, the CLI command for Nginx:
  ```
  nginx -V
  ```
  For Ollama:
  ```
  ollama -v
  ```
  I know that I can do this:
  ```
  curl -i localhost:8080/health
  ```
  But it isn't what I'm looking for. Maybe some `llama-server -health` command is available?
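If the goal is a scriptable health check rather than a dedicated flag, the existing HTTP endpoint can be wrapped in a few lines of shell. This is a sketch assuming a llama-server listening on `localhost:8080`, as in the post:

```shell
# Sketch: probe the /health endpoint and branch on the HTTP status code.
# Assumes a server is listening on localhost:8080 (an assumption from
# the post); curl prints "000" when it cannot connect at all.
status=$(curl -s -o /dev/null -w '%{http_code}' localhost:8080/health || true)
if [ "$status" = "200" ]; then
  echo "server healthy (status: $status)"
else
  echo "server not reachable or unhealthy (status: $status)"
fi
```

`-s` silences progress output, `-o /dev/null` discards the body, and `-w '%{http_code}'` prints only the HTTP status code, which makes the result easy to test in a script or a container health check.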