OpenLMLab / MOSS_Vortex
MOSS Vortex is a lightweight, high-performance deployment and inference backend engineered specifically for MOSS 003. Built on MOSEC and Torch, it provides a wealth of features aimed at enhancing performance and functionality.
☆37 · Updated 2 years ago
Alternatives and similar repositories for MOSS_Vortex
Users interested in MOSS_Vortex are comparing it to the libraries listed below.
- Chinese large language model base built through incremental pre-training on Chinese datasets ☆238 · Updated 2 years ago
- deep learning ☆148 · Updated 5 months ago
- The complete training code of the open-source high-performance Llama model, covering the full process from pre-training to RLHF. ☆67 · Updated 2 years ago
- llama inference for tencentpretrain ☆99 · Updated 2 years ago
- ☆172 · Updated 2 years ago
- moss chat finetuning ☆51 · Updated last year
- Lightweight local website for displaying the performance of different chat models. ☆87 · Updated last year
- [EMNLP 2023] Lion: Adversarial Distillation of Proprietary Large Language Models ☆211 · Updated last year
- Code for Scaling Laws of RoPE-based Extrapolation ☆73 · Updated last year
- GTS Engine (GTS-Engine): a powerful, out-of-the-box natural language understanding engine focused on few-shot tasks, able to automatically produce NLP models from only a handful of samples. ☆92 · Updated 2 years ago
- ☆127 · Updated 2 years ago
- Imitate OpenAI with local models ☆88 · Updated last year
- ☆308 · Updated 2 years ago
- Summary of all open-source large language models and low-cost methods for replicating ChatGPT. ☆137 · Updated 2 years ago
- ☆163 · Updated 2 years ago
- A unified tokenization tool for images, Chinese, and English. ☆151 · Updated 2 years ago
- MOSS 003 WebSearchTool: a simple but reliable implementation ☆45 · Updated 2 years ago
- A cross-model technique combining multi-LoRA weight ensemble switching with zero-finetune enhancement: LLM-Base + LLM-X + Alpaca. Initially, LLM-Base is the ChatGLM-6B base model and LLM-X is a LLaMA enhancement model. The scheme is simple and efficient, aiming to let such language models be deployed widely at low energy cost, and … ☆117 · Updated 2 years ago
- Simple implementation of using LoRA from the peft library to fine-tune chatglm-6b ☆84 · Updated 2 years ago
- A more efficient GLM implementation! ☆54 · Updated 2 years ago
- The kanchil (鼷鹿) is the world's smallest even-toed ungulate; this open-source project explores whether small models (under 6B) can also be aligned with human preferences. ☆113 · Updated 2 years ago
- Alpaca-format Chinese instruction fine-tuning dataset ☆395 · Updated 2 years ago
- ☆123 · Updated last year
- Code for the piccolo embedding model from SenseTime ☆140 · Updated last year
- The official code for "Aurora: Activating chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning" ☆265 · Updated last year
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆140 · Updated last year
- Train LLaMA on a single A100 80G node using 🤗 Transformers and 🚀 DeepSpeed pipeline parallelism ☆224 · Updated last year
- An open-domain multi-turn evaluation benchmark for general-purpose Chinese foundation models ☆80 · Updated 2 years ago
- An instruction-tuning tool for large language models (supports FlashAttention) ☆178 · Updated last year
- MD5 links for a Chinese book corpus ☆217 · Updated last year