wllama (ngxson/wllama)
WebAssembly binding for llama.cpp, enabling in-browser LLM inference