Notice: This resource is provided by a third-party author. Please review the code with AI tools or manually before use to ensure security and compatibility.
C++ | Tencent/hpc-ops

hpc-ops

High Performance LLM Inference Operator Library

Score: 52.6/100
Stars: 833 | Forks: 82

Similar Projects

llama.cpp (Score: 85)
LLM inference in C/C++
C++ | Stars: 106.0K

whisper.cpp (Score: 85)
Port of OpenAI's Whisper model in C/C++
C++ | Stars: 49.0K

rocksdb (Score: 88)
A library that provides an embeddable, persistent key-value store for fast storage.
C++ | Stars: 31.6K

xiaozhi-esp32 (Score: 86)
An MCP-based chatbot
C++ | Stars: 25.9K