OpenLMLab / MOSS_Vortex
MOSS Vortex is a lightweight, high-performance deployment and inference backend built on MOSEC and Torch, engineered specifically for MOSS 003 and offering a range of features aimed at improving performance and functionality.
☆37 · Updated last year
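Since the project description names MOSEC as the serving layer and Torch as the execution engine, the sketch below shows what a minimal MOSEC-style inference worker can look like. This is not the actual MOSS_Vortex code: the placeholder linear model and the `features`/`logits` request schema are illustrative assumptions only.

```python
import torch
from mosec import Server, Worker


class InferenceWorker(Worker):
    """Loads a (placeholder) Torch model once per worker process and serves JSON requests."""

    def __init__(self):
        super().__init__()
        self.device = "cuda" if torch.cuda.is_available() else "cpu"
        # Placeholder model for illustration; a real backend would load MOSS 003 weights here.
        self.model = torch.nn.Linear(4, 2).to(self.device).eval()

    def forward(self, data: dict) -> dict:
        # MOSEC deserializes the JSON request body into `data` by default.
        features = torch.tensor(data["features"], dtype=torch.float32, device=self.device)
        with torch.no_grad():
            logits = self.model(features)
        return {"logits": logits.tolist()}


if __name__ == "__main__":
    server = Server()
    # One worker process; MOSEC's Rust runtime handles queueing and (optional) dynamic batching.
    server.append_worker(InferenceWorker, num=1)
    server.run()
```

With MOSEC's defaults, this server listens on port 8000 and accepts POST requests at `/inference`, e.g. `curl -X POST http://127.0.0.1:8000/inference -d '{"features": [1.0, 2.0, 3.0, 4.0]}'`.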
Alternatives and similar repositories for MOSS_Vortex:
Users who are interested in MOSS_Vortex are comparing it to the repositories listed below.
- GTS Engine: a powerful NLU training system. GTS-Engine is an out-of-the-box, high-performance natural language understanding engine focused on few-shot tasks; it can automatically produce NLP models from only a small number of samples. ☆91 · Updated last year
- ☆173 · Updated last year
- LLaMA inference for TencentPretrain ☆96 · Updated last year
- A unified tokenization tool for images, Chinese, and English. ☆151 · Updated last year
- Code for Scaling Laws of RoPE-based Extrapolation ☆71 · Updated last year
- MOSS chat fine-tuning ☆50 · Updated 8 months ago
- Lightweight local website for displaying the performance of different chat models. ☆85 · Updated last year
- Fast encoding detection and conversion for large numbers of text files, built to assist data cleaning for the MNBVC corpus project. ☆56 · Updated 2 months ago
- Source code for the ACL 2023 paper Decoder Tuning: Efficient Language Understanding as Decoding ☆48 · Updated last year
- Text deduplication ☆68 · Updated 7 months ago
- A Chinese large language model base built through incremental pre-training on Chinese datasets ☆233 · Updated last year
- MOSS 003 WebSearchTool: a simple but reliable implementation ☆45 · Updated last year
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆127 · Updated last month
- Chinese instruction tuning datasets ☆124 · Updated 9 months ago
- ☆159 · Updated last year
- A more efficient GLM implementation! ☆55 · Updated last year
- ☆125 · Updated last year
- Code for the piccolo embedding model from SenseTime ☆117 · Updated 7 months ago
- A simple implementation of using LoRA from the peft library to fine-tune ChatGLM-6B ☆86 · Updated last year
- An instruction tuning toolkit for large language models (supports FlashAttention) ☆168 · Updated last year
- NLU & NLG (zero-shot) depend on mengzi-t5-base-mt pretrained model☆75Updated 2 years ago
- 中文原生检索增强生成测评基准☆105Updated 9 months ago
- ☆93Updated last year
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc.☆133Updated 9 months ago
- 中文大语言模型评测第一期☆108Updated last year
- The complete training code of the open-source high-performance Llama model, including the full process from pre-training to RLHF.☆64Updated last year
- 使用qlora对中文大语言模型进行微调,包含ChatGLM、Chinese-LLaMA-Alpaca、BELLE☆85Updated last year
- 中文图书语料MD5链接☆213Updated 11 months ago
- SuperCLUE-Agent: 基于中文原生任务的Agent智能体核心能力测评基准☆79Updated last year
- NTK scaled version of ALiBi position encoding in Transformer.☆67Updated last year