vLLM is a massive mess and an absolute hassle to deal with, and I don't want to go back to that. When will this awesome-sounding model be available for llama.cpp?
Same here, I hope this open model can be loaded with llama.cpp.