Notice: This resource is provided by a third-party author. Please review the code with AI tools or manually before use to ensure security and compatibility.
Blaizzy/mlx-vlm (Python)

mlx-vlm

MLX-VLM is a package for inference and fine-tuning of Vision Language Models (VLMs) on your Mac using MLX.

Score: 83.7/100
Stars: 4.5K | Forks: 482

Similar Projects

omlx
Score: 88
LLM inference server with continuous batching & SSD caching for Apple Silicon — managed from the macOS menu bar
Python, 11.0K stars

vllm-mlx
Score: 77
OpenAI and Anthropic compatible server for Apple Silicon. Run LLMs and vision-language models (Llama, Qwen-VL, LLaVA) with continuous batching, MCP tool calling, and multimodal support. Native MLX backend, 400+ tok/s. Works with Claude Code.
Python, 917 stars

LLaVA-OneVision-1.5
Score: 51
Fully Open Framework for Democratized Multimodal Training
Python, 794 stars

VLMEvalKit
Score: 82
An open-source evaluation toolkit for large multi-modality models (LMMs), supporting 220+ LMMs and 80+ benchmarks
Python, 4.1K stars