SUSTech-IDEA / SUS-Chat
SUS-Chat: Instruction tuning done right
☆49 · Updated last year
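Since this page gives only the repository tagline, here is a minimal sketch of how one might load and query SUS-Chat with Hugging Face transformers. The checkpoint ID "SUSTech/SUS-Chat-34B" and the "### Human / ### Assistant" prompt format are assumptions based on common usage of this model family, not details taken from this page; check the repository for the official recipe.

```python
# Minimal sketch: chatting with SUS-Chat via Hugging Face transformers.
# The model ID and prompt format below are assumptions, not official guidance.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "SUSTech/SUS-Chat-34B"  # assumed Hugging Face model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Single-turn prompt in the assumed instruction format.
prompt = "### Human: What is instruction tuning?\n\n### Assistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```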
Alternatives and similar repositories for SUS-Chat
Users interested in SUS-Chat are comparing it to the repositories listed below.
- The official code for "Aurora: Activating Chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning" ☆266 · Updated last year
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆141 · Updated last year
- ☆235 · Updated last year
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆139 · Updated last year
- Mixture-of-Experts (MoE) Language Model ☆192 · Updated last year
- Imitate OpenAI with Local Models ☆89 · Updated last year
- The official repo of the Aquila2 series proposed by BAAI, including pretrained & chat large language models ☆446 · Updated last year
- LongQLoRA: Extend Context Length of LLMs Efficiently ☆167 · Updated 2 years ago
- Code for "Scaling Laws of RoPE-based Extrapolation" ☆73 · Updated 2 years ago
- Lightweight local website for displaying the performance of different chat models ☆87 · Updated 2 years ago
- Aims to provide an intuitive, concrete, and standardized evaluation of current mainstream LLMs ☆95 · Updated 2 years ago
- Text deduplication ☆77 · Updated last year
- ☆106 · Updated 2 years ago
- Instruction tuning toolkit for large language models (supports FlashAttention) ☆178 · Updated last year
- Code for the piccolo embedding model from SenseTime