Chinese Mixtral mixture-of-experts large language models (Chinese Mixtral MoE LLMs)
A Python library for extracting structured information from unstructured text using LLMs with precise source grounding and interactive visualization.
Chinese LLaMA & Alpaca large language models with local CPU/GPU training and deployment (Chinese LLaMA & Alpaca LLMs)
Phase 2 of the Chinese LLaMA-2 & Alpaca-2 large model project, including 64K long-context models (Chinese LLaMA-2 & Alpaca-2 LLMs with 64K long context models)
Phase 3 of the Chinese Llama large model project (Chinese Llama-3 LLMs), developed from Meta Llama 3