Python · waybarrios/vllm-mlx

vllm-mlx

OpenAI and Anthropic compatible server for Apple Silicon. Run LLMs and vision-language models (Llama, Qwen-VL, LLaVA) with continuous batching, MCP tool calling, and multimodal support. Native MLX backend, 400+ tok/s. Works with Claude Code.
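Since the server exposes an OpenAI-compatible API, a client can talk to it with a standard chat-completions request. A minimal sketch follows; the base URL, port, and model name are assumptions for illustration, not taken from the project's documentation:

```python
import json
from urllib import request

# Assumed local endpoint. An OpenAI-compatible server conventionally serves
# the /v1/chat/completions route with the OpenAI payload shape; the port
# and model identifier below are placeholders, not values from vllm-mlx docs.
BASE_URL = "http://localhost:8000/v1"


def build_chat_request(model: str, prompt: str, max_tokens: int = 128) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }


def send(payload: dict) -> dict:
    """POST the payload to the local server (requires the server to be running)."""
    req = request.Request(
        BASE_URL + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Model name is a hypothetical example of an MLX-format model.
    payload = build_chat_request(
        "mlx-community/Llama-3.2-3B-Instruct-4bit", "Hello"
    )
    print(json.dumps(payload, indent=2))
```

Because the wire format is the standard OpenAI one, existing OpenAI client libraries should also work by pointing their base URL at the local server.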

Score: 60.0/100
Stars: 531 · Forks: 71

Similar Projects

Open-Interface
Score: 76
Control Any Computer Using LLMs.
Python · 2.6K stars

mlx-vlm
Score: 82
MLX-VLM is a package for inference and fine-tuning of Vision Language Models (VLMs) on your Mac using MLX.
Python · 2.2K stars

pixeltable
Score: 87
Data infrastructure providing a declarative, incremental approach for multimodal AI workloads.
Python · 1.6K stars

vllm
Score: 93
A high-throughput and memory-efficient inference and serving engine for LLMs.
Python · 72.4K stars