Issue
I am trying to run two local LLMs to act as an architect and a coder. I am using macOS and the mlx-lm package:
Terminal 1 (editor):
mlx_lm.server --model mlx-community/Qwen2.5-Coder-14B-Instruct-4bit
Terminal 2 (architect):
mlx_lm.server --model mlx-community/Qwen_QwQ-32B-Preview_MLX-4bit --port 8081
Terminal 3 (aider):
aider --openai-api-base http://127.0.0.1:8080/v1 --openai-api-key secret --model openai/http://localhost:8081/v1 --editor-model openai/http://localhost:8080/v1 --architect
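Before launching aider, a quick sanity check along these lines can confirm that each mlx_lm.server instance is reachable and speaking the OpenAI-compatible API (a sketch, not part of my setup: it assumes the default /v1/chat/completions route on the ports above, and the payload values are placeholders):

```shell
# Sketch: verify both mlx_lm.server instances answer OpenAI-style requests.
# Ports match the two commands above; each server already has its model
# loaded, so no "model" field is sent here (placeholder payload).
curl -s http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "hi"}], "max_tokens": 8}'

curl -s http://127.0.0.1:8081/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "hi"}], "max_tokens": 8}'
```

If either request returns an HTTP 404, the problem is between the server and the URL, not aider itself.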
The error I get is the following:
litellm.NotFoundError: NotFoundError: OpenAIException - Not Found
Version and model info
Aider: 0.76.2
Model: Qwen2.5