OpenLMLab / MOSS_Vortex
MOSS Vortex is a lightweight, high-performance deployment and inference backend engineered specifically for MOSS 003. Built on MOSEC and Torch, it provides a wealth of features aimed at improving performance and functionality.
☆37 · Updated last year
Alternatives and similar repositories for MOSS_Vortex:
Users interested in MOSS_Vortex are comparing it to the libraries listed below.
- MOSS 003 WebSearchTool: A simple but reliable implementation ☆45 · Updated last year
- ☆172 · Updated last year
- Code for "Scaling Laws of RoPE-based Extrapolation" ☆72 · Updated last year
- A lightweight local website for displaying the performance of different chat models. ☆85 · Updated last year
- The complete training code for the open-source high-performance Llama model, covering the full pipeline from pre-training to RLHF. ☆65 · Updated 2 years ago
- MOSS chat fine-tuning ☆50 · Updated 11 months ago
- A unified tokenization tool for images, Chinese, and English. ☆151 · Updated 2 years ago
- Fast encoding detection and conversion for large volumes of text files, to assist data cleaning for the MNBVC corpus project ☆60 · Updated 5 months ago
- ☆160 · Updated last year
- ☆128 · Updated last year
- MEASURING MASSIVE MULTITASK CHINESE UNDERSTANDING ☆87 · Updated last year
- GTS Engine (GTS-Engine): a powerful, out-of-the-box NLU training engine focused on few-shot tasks, able to produce NLP models automatically from only a handful of samples. ☆91 · Updated 2 years ago
- A Chinese large language model base built through incremental pre-training on Chinese datasets ☆235 · Updated last year
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆139 · Updated 11 months ago
- An open-source multimodal large language model based on baichuan-7b ☆73 · Updated last year
- Llama inference for TencentPretrain ☆98 · Updated last year
- An NTK-scaled version of the ALiBi position encoding in Transformers. ☆67 · Updated last year
- Gaokao Benchmark for AI ☆108 · Updated 2 years ago
- A more efficient GLM implementation! ☆55 · Updated 2 years ago
- Summarizes open-source large language models and low-cost replication methods for ChatGPT. ☆135 · Updated last year
- A code implementation of Dynamic NTK-ALiBi for Baichuan: inference over longer texts without fine-tuning ☆47 · Updated last year
- Source code for the ACL 2023 paper "Decoder Tuning: Efficient Language Understanding as Decoding" ☆48 · Updated last year
- ☆59 · Updated last year
- Kanchil (the chevrotain) is the world's smallest even-toed ungulate; this open-source project explores whether small models (under 6B) can also be aligned with human preferences. ☆113 · Updated last year
- Deep learning ☆150 · Updated 2 weeks ago
- ☆124 · Updated last year
- A Chinese instruction dataset for fine-tuning LLMs ☆27 · Updated last year
- CLongEval: A Chinese Benchmark for Evaluating Long-Context Large Language Models ☆40 · Updated last year
- Silk Road will be the dataset zoo for Luotuo (骆驼). Luotuo is an open-source Chinese-LLM project founded by 陈启源 @ 华中师范大学 & 李鲁鲁 @ 商汤科技 & 冷子… ☆38 · Updated last year
- Text deduplication ☆69 · Updated 10 months ago