OpenLMLab / MOSS_Vortex
MOSS Vortex is a lightweight, high-performance deployment and inference backend engineered specifically for MOSS 003. Built on MOSEC and Torch, it provides a range of features aimed at improving performance and functionality.
☆37 · Updated last year
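Since the backend is built on MOSEC and Torch, the serving pattern is a MOSEC `Worker` that wraps a Torch model behind an HTTP endpoint. Below is a minimal sketch of that pattern; the checkpoint name, request fields, and generation settings are illustrative assumptions, not the actual MOSS_Vortex code.

```python
# Minimal sketch of a MOSEC worker serving a Torch causal LM, in the spirit of
# MOSS Vortex. The checkpoint, request schema, and generation settings below
# are assumptions for illustration only.
import torch
from mosec import Server, Worker
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "fnlp/moss-moon-003-sft"  # assumed checkpoint choice


class MossInference(Worker):
    def __init__(self):
        super().__init__()
        # Load tokenizer and model once per worker process.
        self.tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME, trust_remote_code=True)
        self.model = (
            AutoModelForCausalLM.from_pretrained(
                MODEL_NAME, torch_dtype=torch.float16, trust_remote_code=True
            )
            .cuda()
            .eval()
        )

    def forward(self, data: dict) -> dict:
        # `data` is the JSON body of one request, e.g. {"prompt": "...", "max_new_tokens": 64}.
        inputs = self.tokenizer(data["prompt"], return_tensors="pt").to(self.model.device)
        with torch.inference_mode():
            output = self.model.generate(
                **inputs, max_new_tokens=int(data.get("max_new_tokens", 64))
            )
        return {"text": self.tokenizer.decode(output[0], skip_special_tokens=True)}


if __name__ == "__main__":
    server = Server()
    server.append_worker(MossInference, num=1)  # one GPU-bound worker process
    server.run()
```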
Related projects
Alternatives and complementary repositories for MOSS_Vortex
- MOSS 003 WebSearchTool: A simple but reliable implementation ☆45 · Updated last year
- Simple implementation of using LoRA from the peft library to fine-tune chatglm-6b ☆86 · Updated last year
- Code for Scaling Laws of RoPE-based Extrapolation ☆70 · Updated last year
- LLaMA inference for TencentPretrain ☆96 · Updated last year
- A more efficient GLM implementation! ☆55 · Updated last year
- A unified tokenization tool for images, Chinese, and English. ☆150 · Updated last year
- NTK-scaled version of ALiBi position encoding in Transformer. ☆66 · Updated last year
- MOSS chat finetuning ☆50 · Updated 6 months ago
- ☆173 · Updated last year
- Silk Road will be the dataset zoo for Luotuo (骆驼). Luotuo is an open-source Chinese LLM project founded by 陈启源 @ 华中师范大学, 李鲁鲁 @ 商汤科技, and 冷子… ☆37 · Updated last year
- Chinese instruction tuning datasets ☆118 · Updated 7 months ago
- ☆157 · Updated last year
- Text deduplication ☆67 · Updated 5 months ago
- NLU & NLG (zero-shot) based on the mengzi-t5-base-mt pretrained model ☆75 · Updated 2 years ago
- A project for fast encoding detection and conversion of large numbers of text files, to assist data cleaning for the MNBVC corpus project ☆55 · Updated 3 weeks ago
- Source code for the ACL 2023 paper Decoder Tuning: Efficient Language Understanding as Decoding ☆48 · Updated last year
- SuperCLUE-Math6: exploring a new generation of natively Chinese multi-turn, multi-step mathematical reasoning datasets ☆44 · Updated 9 months ago
- GTS Engine: A powerful NLU training system. GTS Engine (GTS-Engine) is an out-of-the-box, high-performance natural language understanding engine focused on few-shot tasks, capable of automatically producing NLP models from only a small number of samples. ☆89 · Updated last year
- Chinese large language model base generated through incremental pre-training on Chinese datasets ☆234 · Updated last year
- ChatGLM2-6B fine-tuning: SFT/LoRA, instruction finetuning ☆106 · Updated last year
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆121 · Updated 10 months ago
- ☆74 · Updated last year
- Code implementation of Dynamic NTK-ALiBi for Baichuan: inference over longer texts without fine-tuning ☆46 · Updated last year
- Kanchil (鼷鹿) is the world's smallest even-toed ungulate; this open-source project explores whether small models (under 6B) can also be aligned with human preferences. ☆114 · Updated last year
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆132 · Updated 7 months ago
- CLongEval: A Chinese Benchmark for Evaluating Long-Context Large Language Models ☆38 · Updated 8 months ago
- How to train an LLM tokenizer ☆129 · Updated last year
- ☆82 · Updated last year
- Deep learning ☆149 · Updated 4 months ago
- Code for the Piccolo embedding model from SenseTime ☆106 · Updated 5 months ago