🐛 Bug
Recently, I updated CUDA to 12. However, I'm encountering errors when attempting to install vLLM and llama.cpp via pip. Could there be an issue with my CUDA installation? For reference, the error messages are:
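For context on why the CUDA version matters here: prebuilt wheels for GPU packages are compiled against a specific CUDA major version, so a mismatch between the locally installed toolkit and the wheel's build target is a common cause of install failures. A minimal sketch of how one might check for such a mismatch (the helper names `local_cuda_major` and `wheel_matches`, and the assumption that the wheel targets CUDA 12, are mine, not from vLLM or llama.cpp):

```python
import re
import shutil
import subprocess

def local_cuda_major():
    """Return the major CUDA toolkit version reported by nvcc, or None if nvcc is absent."""
    if shutil.which("nvcc") is None:
        return None
    out = subprocess.run(["nvcc", "--version"], capture_output=True, text=True).stdout
    # nvcc prints a line like: "Cuda compilation tools, release 12.1, V12.1.105"
    m = re.search(r"release (\d+)\.(\d+)", out)
    return int(m.group(1)) if m else None

def wheel_matches(local_major, wheel_major=12):
    """A wheel built for CUDA `wheel_major` generally needs the same local major version."""
    return local_major == wheel_major
```

Comparing `local_cuda_major()` against the CUDA version the failing wheel was built for would help confirm or rule out a toolkit mismatch before digging into the pip output.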


To Reproduce
Expected behavior
Additional context