OpenLMLab / MOSS_Vortex
MOSS Vortex is a lightweight, high-performance deployment and inference backend engineered specifically for MOSS 003. Built on MOSEC and Torch, it provides a range of features aimed at improving performance and functionality.
☆37 · Updated 2 years ago
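The description above names MOSEC and Torch as the project's foundations. As a rough, hedged illustration (not MOSS_Vortex's actual code), the sketch below shows how a MOSEC `Worker` can wrap a Torch model and expose it as an HTTP inference service; the `Echo` module and the `token_ids`/`result` fields are hypothetical stand-ins for the real MOSS 003 model and request schema.

```python
# Minimal sketch of a MOSEC + Torch inference service.
# This is NOT MOSS_Vortex's implementation, only an illustration of the pattern.
import torch
from mosec import Server, Worker


class Echo(torch.nn.Module):
    """Hypothetical stand-in for the real MOSS 003 model."""

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # Trivial computation so the sketch stays runnable.
        return token_ids.sum(dim=-1)


class Inference(Worker):
    def __init__(self):
        super().__init__()
        # A real backend would load the MOSS 003 weights here.
        self.model = Echo().eval()

    def forward(self, data: dict) -> dict:
        # MOSEC deserializes the JSON request body into `data` by default.
        ids = torch.tensor(data.get("token_ids", []), dtype=torch.long)
        with torch.no_grad():
            out = self.model(ids)
        return {"result": out.tolist()}


if __name__ == "__main__":
    server = Server()
    server.append_worker(Inference, num=1)  # scale out by raising `num`
    server.run()
```

Running the script starts an HTTP server whose `/inference` endpoint accepts JSON requests such as `{"token_ids": [1, 2, 3]}`; the same worker pattern generalizes to a real causal language model behind the endpoint.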
Alternatives and similar repositories for MOSS_Vortex:
Users interested in MOSS_Vortex are comparing it to the libraries listed below.
- ☆172 · Updated 2 years ago
- A Chinese large language model base produced by incremental pre-training on Chinese datasets ☆236 · Updated last year
- MOSS 003 WebSearchTool: a simple but reliable implementation ☆45 · Updated last year
- A lightweight local website for displaying the performance of different chat models ☆86 · Updated last year
- GTS Engine: a powerful NLU training system. GTS-Engine is an out-of-the-box, high-performance natural language understanding engine that focuses on few-shot tasks and can automatically produce NLP models from only a small number of samples. ☆91 · Updated 2 years ago
- ☆128 · Updated last year
- Measuring Massive Multitask Chinese Understanding ☆87 · Updated last year
- Zero-shot NLU & NLG based on the mengzi-t5-base-mt pretrained model ☆75 · Updated 2 years ago
- Aims to provide an intuitive, concrete, and standardized evaluation of current mainstream LLMs ☆94 · Updated last year
- Code for Scaling Laws of RoPE-based Extrapolation ☆73 · Updated last year
- A simple implementation that uses LoRA from the peft library to fine-tune ChatGLM-6B ☆84 · Updated 2 years ago
- MOSS chat fine-tuning ☆50 · Updated last year
- A unified tokenization tool for images, Chinese, and English ☆152 · Updated 2 years ago
- A more efficient GLM implementation! ☆55 · Updated 2 years ago
- OPD: Chinese Open-Domain Pre-trained Dialogue Model ☆75 · Updated last year
- Source code for the ACL 2023 paper "Decoder Tuning: Efficient Language Understanding as Decoding" ☆49 · Updated last year
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆139 · Updated last year
- An open-source multimodal large language model based on baichuan-7b ☆73 · Updated last year
- LLaMA inference for TencentPretrain ☆98 · Updated last year
- ☆160 · Updated 2 years ago
- Complete training code for an open-source, high-performance Llama model, covering the full pipeline from pre-training to RLHF ☆65 · Updated 2 years ago
- Chinese large language model evaluation, round 1 ☆108 · Updated last year
- ☆59 · Updated last year
- NTK-scaled version of the ALiBi position encoding for Transformers ☆67 · Updated last year
- Chinese large language model evaluation, round 2 ☆70 · Updated last year
- ☆124 · Updated last year
- ☆97 · Updated last year
- MultilingualShareGPT, a free multilingual corpus for LLM training ☆72 · Updated 2 years ago
- Gaokao Benchmark for AI ☆108 · Updated 2 years ago
- MD5 links for a Chinese book corpus ☆217 · Updated last year