OpenLMLab / MOSS_Vortex
MOSS Vortex is a lightweight, high-performance deployment and inference backend engineered specifically for MOSS 003. Built on MOSEC and Torch, it provides a wealth of features aimed at enhancing performance and functionality.
☆37 · Updated last year
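Since MOSS Vortex is built on MOSEC, a service following that design would resemble the minimal sketch below. This is a hypothetical illustration, not MOSS_Vortex's actual code: the `TextGeneration` worker, the `prompt`/`completion` request fields, and the stubbed model call are all placeholder assumptions.

```python
# A minimal sketch of a MOSEC-based inference service, assuming a JSON
# request with a "prompt" field. All names here are hypothetical; this is
# not code from the MOSS_Vortex repository.
from mosec import Server, Worker


class TextGeneration(Worker):
    def __init__(self):
        super().__init__()
        # A real backend would load a Torch model here (e.g. a causal LM
        # moved onto the GPU); it is stubbed out to keep the sketch runnable.
        self.model = None

    def forward(self, data: dict) -> dict:
        # MOSEC deserializes the JSON request body into `data` by default.
        prompt = data.get("prompt", "")
        # Hypothetical generation step: a real worker would tokenize the
        # prompt, call self.model.generate(...), and decode the output.
        return {"completion": prompt}


if __name__ == "__main__":
    server = Server()
    server.append_worker(TextGeneration, num=1)  # one worker process
    server.run()
```

With MOSEC's defaults, such a service listens on port 8000 and accepts POST requests at the `/inference` endpoint.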
Alternatives and similar repositories for MOSS_Vortex:
Users interested in MOSS_Vortex are comparing it to the libraries listed below.
- MOSS 003 WebSearchTool: A simple but reliable implementation ☆45 · Updated last year
- ☆159 · Updated last year
- A simple implementation of using LoRA from the PEFT library to fine-tune ChatGLM-6B ☆85 · Updated last year
- ☆173 · Updated last year
- MOSS chat fine-tuning ☆50 · Updated 9 months ago
- GTS Engine: a powerful, out-of-the-box NLU training engine focused on few-shot tasks, able to automatically produce NLP models from only a handful of samples ☆91 · Updated last year
- Deep learning ☆150 · Updated 7 months ago
- Source code for the ACL 2023 paper "Decoder Tuning: Efficient Language Understanding as Decoding" ☆48 · Updated last year
- A more efficient GLM implementation! ☆55 · Updated 2 years ago
- Code for "Scaling Laws of RoPE-based Extrapolation" ☆70 · Updated last year
- CLongEval: A Chinese Benchmark for Evaluating Long-Context Large Language Models ☆38 · Updated 11 months ago
- The complete training code of the open-source high-performance Llama model, including the full process from pre-training to RLHF ☆64 · Updated last year
- A lightweight local website for displaying the performance of different chat models ☆85 · Updated last year
- ChatGLM2-6B fine-tuning: SFT/LoRA, instruction fine-tuning ☆105 · Updated last year
- A Chinese large language model base produced through incremental pre-training on Chinese datasets ☆235 · Updated last year
- LLaMA inference for TencentPretrain ☆97 · Updated last year
- Fine-tuning Chinese large language models with QLoRA, covering ChatGLM, Chinese-LLaMA-Alpaca, and BELLE ☆85 · Updated last year
- How to train an LLM tokenizer ☆140 · Updated last year
- ☆121 · Updated last year
- Chinese large language model evaluation, round two ☆70 · Updated last year
- ☆96 · Updated 11 months ago
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆129 · Updated 2 months ago
- Text deduplication ☆68 · Updated 8 months ago
- Zero-shot NLU & NLG based on the mengzi-t5-base-mt pretrained model ☆74 · Updated 2 years ago
- A code implementation of Dynamic NTK-ALiBi for Baichuan: inference over longer texts without fine-tuning ☆47 · Updated last year
- ☆125 · Updated last year
- An open-source multimodal large language model based on baichuan-7b ☆73 · Updated last year
- SuperCLUE-Math6: an exploration of a new generation of native-Chinese multi-turn, multi-step mathematical reasoning datasets ☆53 · Updated last year
- ☆94 · Updated last year
- Clustering and Ranking: Diversity-preserved Instruction Selection through Expert-aligned Quality Estimation ☆74 · Updated 3 months ago