Description
LocalAI version:
docker master-aio-cpu
Environment, CPU architecture, OS, and Version:
macOS, Apple M4
Describe the bug
Unable to have a normal conversation: the model fails to load and every chat request ends with an error (see logs below).
To Reproduce
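A minimal sketch of the kind of request that triggers the failure, assuming LocalAI is reachable on its default port 8080 via the OpenAI-compatible chat endpoint; the model name is taken from the modelID in the logs below:

```python
# Sketch only: assumes LocalAI on localhost:8080 and the model name from the logs.
import requests

response = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "z-image-Q2_K.gguf",
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=120,
)
# Instead of a chat completion, the request returns the load error shown in the logs.
print(response.status_code)
print(response.text)
```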
Expected behavior
The model loads and a normal conversation is possible.
Logs
Feb 04 09:27:56 INFO BackendLoader starting modelID="z-image-Q2_K.gguf" backend="llama-cpp" model="z-image-Q2_K.gguf"
Feb 04 09:27:58 ERROR Failed to load model modelID="z-image-Q2_K.gguf" error=failed to load model with internal loader: could not load model: rpc error: code = Internal desc = Failed to load model: /models/z-image-Q2_K.gguf. Error: llama_model_load: error loading model: error loading model architecture: unknown model architecture: 'lumina2'; llama_model_load_from_file_impl: failed to load model; llama_params_fit: encountered an error while trying to fit params to free device memory: failed to load model; llama_model_load: error loading model: error loading model architecture: unknown model architecture: 'lumina2'; llama_model_load_from_file_impl: failed to load model backend="llama-cpp"
Feb 04 09:27:58 ERROR Stream ended with error error=failed to load model with internal loader: could not load model: rpc error: code = Internal desc = Failed to load model: /models/z-image-Q2_K.gguf. Error: llama_model_load: error loading model: error loading model architecture: unknown model architecture: 'lumina2'; llama_model_load_from_file_impl: failed to load model; llama_params_fit: encountered an error while trying to fit params to free device memory: failed to load model; llama_model_load: error loading model: error loading model architecture: unknown model architecture: 'lumina2'; llama_model_load_from_file_impl: failed to load model
Additional context