Notice: This resource is provided by a third-party author. Please review the code with AI tools or manually before use to ensure security and compatibility.
ngxson/wllama (TypeScript)

wllama

WebAssembly binding for llama.cpp, enabling in-browser LLM inference

Score: 67.3/100
Stars: 1.0K · Forks: 73

Similar Projects

jan

Score: 91

Jan is an open source alternative to ChatGPT that runs 100% offline on your computer.

TypeScript · 40.9K stars

LlamaIndexTS

Score: 84

Data framework for your LLM applications, focused on server-side solutions.

TypeScript · 3.1K stars

node-llama-cpp

Score: 89

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on model output at the generation level.

TypeScript · 1.9K stars

BrowserAI

Score: 83

Run local LLMs like llama, deepseek-distill, kokoro and more inside your browser

TypeScript · 1.4K stars