CUDA · alibaba/rtp-llm

rtp-llm

RTP-LLM: Alibaba's high-performance LLM inference engine for diverse applications.

Score: 79.9/100
Stars: 1.1K · Forks: 158

Similar Projects

vllm

Score: 93

A high-throughput and memory-efficient inference and serving engine for LLMs

Python · 72.4K stars

vllm-ascend

Score: 77

Community-maintained hardware plugin for vLLM on Ascend

Python · 1.7K stars

ZhiLight

Score: 73

A highly optimized LLM inference acceleration engine for Llama and its variants.

C++ · 905 stars

OpenLLM

Score: 89

Run any open-source LLM, such as DeepSeek or Llama, as an OpenAI-compatible API endpoint in the cloud.

Python · 12.1K stars