EricLBuehler/candle-vllm
candle-vllm
Efficient platform for inference and serving of local LLMs, including an OpenAI-compatible API server.
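Because the server exposes an OpenAI-compatible API, clients send the standard chat-completion request shape. The sketch below builds such a request body; the model id and field values are placeholders for illustration, not confirmed candle-vllm defaults.

```python
import json

# Standard OpenAI-compatible chat completion request body.
# "llama" is a placeholder model id; real values depend on the model
# you launched the server with.
request = {
    "model": "llama",
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
    "max_tokens": 64,
}

# Serialize to JSON, as it would be POSTed to the server's
# /v1/chat/completions endpoint.
body = json.dumps(request)
print(body)
```

Any OpenAI client library or a plain HTTP POST with this body should work against the server, since the endpoint follows the OpenAI wire format.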