
[Ollama] ERROR: Non-retryable error occurred: 404 page not found #182

Open
mdzidic opened this issue Feb 4, 2025 · 4 comments

mdzidic commented Feb 4, 2025

Description:

I'm running ComfyUI inside a Docker container alongside a local Ollama server. I have edited urls.json, and the models load correctly. However, when I attempt to generate an enhanced prompt, the Troubleshooting output shows the following error:


➤ Begin Log for: Advanced Prompt Enhancer, Node #6:
✦ INFO: Additional parameters input: []
✦ INFO: Setting client to OpenAI Open Source LLM object
✦ INFO: Maximum tries set to: 3
✦ ERROR: Non-retryable error occurred: 404 page not found
✦ ERROR: Request failed: 404 page not found

➤ Begin Log for: Ollama Unload Model Setting:
✦ INFO: URL was validated and is being presented as: http://host.docker.internal:11434/api/generate
✦ INFO: Attempting to set model TTL using URL: http://host.docker.internal:11434/api/generate
✦ INFO: Model unload setting successful.  Response: {"model":"llama3.1:latest","created_at":"2025-02-04T13:17:00.393244595Z","response":"","done":true,"done_reason":"load"}
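
The contrast between the two logs suggests the cause: the unload call posts directly to Ollama's native /api/generate endpoint and succeeds, while the prompt-enhancement call goes through an OpenAI-style client that appends its own path (such as /chat/completions) to the configured base URL, so a base URL ending in /api/generate resolves to a path Ollama does not serve. A minimal sketch of the working configuration, assuming the standard openai Python client and Ollama's documented OpenAI-compatible /v1 root (not the node's actual code):

```python
# A minimal sketch: the standard `openai` client appends /chat/completions
# to base_url, so the base URL must be Ollama's OpenAI-compatible /v1 root
# rather than the native /api/generate endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="http://host.docker.internal:11434/v1",  # /api/generate here would 404
    api_key="ollama",  # Ollama ignores the key, but the client requires a value
)

resp = client.chat.completions.create(
    model="llama3.1:latest",
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```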

Expected Behavior:

The enhanced prompt should generate successfully without errors.

Additional Information:

  • ComfyUI version: 0.3.13
  • Ollama version: 0.3.9

glibsonoran commented Feb 4, 2025 via email


glibsonoran commented Feb 4, 2025

You can also try running it with the http.../generate URL using the AI_service: Direct Web Connection (URL):

Ollama: http://host.docker.internal:11434/v1
Direct Web: http://host.docker.internal:11434/v1/chat/completions
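
With the Direct Web Connection form, the node is given the complete endpoint and posts to it as-is. A minimal sketch of an equivalent request, assuming the requests library and the OpenAI chat-completions payload schema that Ollama's /v1/chat/completions endpoint accepts:

```python
# A sketch of what a Direct Web Connection (URL) request amounts to:
# the full endpoint is used verbatim, with an OpenAI-style JSON payload.
import requests

url = "http://host.docker.internal:11434/v1/chat/completions"
payload = {
    "model": "llama3.1:latest",
    "messages": [{"role": "user", "content": "Hello"}],
}

resp = requests.post(url, json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Either way, the key point is that /v1 is Ollama's OpenAI-compatible layer; the native /api/generate endpoint uses a different request schema and is not interchangeable with it.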


mdzidic commented Feb 4, 2025

Ollama URL .../v1 did the trick, thanks!

glibsonoran commented

Good news, enjoy! :)
