THUDM / SwissArmyTransformer
SwissArmyTransformer is a flexible and powerful library for developing your own Transformer variants.
☆1,044 · Updated 3 weeks ago
Alternatives and similar repositories for SwissArmyTransformer:
Users interested in SwissArmyTransformer are comparing it to the libraries listed below.
- Official implementation of the paper "MiniGPT-5: Interleaved Vision-and-Language Generation via Generative Vokens" ☆858 · Updated last month
- Open Academic Research on Improving LLaMA to SOTA LLM ☆1,615 · Updated last year
- Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo ☆1,054 · Updated 5 months ago
- [NeurIPS 2023] RRHF & Wombat ☆802 · Updated last year
- DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models ☆1,083 · Updated last year
- ⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024) ☆908 · Updated last month
- Rotary Transformer (rotary position embeddings, RoPE; a plain-PyTorch sketch appears after this list) ☆858 · Updated 2 years ago
- A plug-and-play library for parameter-efficient tuning (Delta Tuning) ☆1,009 · Updated 3 months ago
- Code and models for the paper "One Transformer Fits All Distributions in Multi-Modal Diffusion" ☆1,394 · Updated last year
- LOMO: LOw-Memory Optimization ☆979 · Updated 6 months ago
- Secrets of RLHF in Large Language Models Part I: PPO ☆1,320 · Updated 10 months ago
- Efficient Training (including pre-training and fine-tuning) for Big Models ☆574 · Updated 5 months ago
- Code for our EMNLP 2023 paper "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models" ☆1,108 · Updated 10 months ago
- Collaborative Training of Large Language Models in an Efficient Way ☆411 · Updated 4 months ago
- Ongoing research training transformer language models at scale, including BERT & GPT-2 ☆1,354 · Updated 9 months ago
- Emu Series: Generative Multimodal Models from BAAI ☆1,673 · Updated 3 months ago
- Implementation of MEGABYTE, Predicting Million-byte Sequences with Multiscale Transformers, in PyTorch ☆632 · Updated 3 weeks ago
- Next-Token Prediction is All You Need ☆1,965 · Updated 2 months ago
- A family of lightweight multimodal models ☆972 · Updated last month
- [NeurIPS 2024] SimPO: Simple Preference Optimization with a Reference-Free Reward (the loss is sketched after this list) ☆800 · Updated 2 months ago
- Real Transformer TeraFLOPS on various GPUs (the standard FLOPs estimate is sketched after this list) ☆892 · Updated last year
- Rectified Rotary Position Embeddings ☆348 · Updated 7 months ago
- X-LLM: Bootstrapping Advanced Large Language Models by Treating Multi-Modalities as Foreign Languages ☆307 · Updated last year
- mPLUG-Owl: The Powerful Multi-modal Large Language Model Family ☆2,393 · Updated last month
- The official repo of the Aquila2 series proposed by BAAI, including pretrained & chat large language models ☆440 · Updated 3 months ago
- Safe RLHF: Constrained Value Alignment via Safe Reinforcement Learning from Human Feedback ☆1,394 · Updated 7 months ago
- A purer tokenizer with a higher compression ratio ☆468 · Updated last month
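
A few of the entries above name concrete techniques that are easy to sketch. First, rotary position embeddings (RoPE), the idea behind the Rotary Transformer entry: each consecutive pair of channels is rotated by a position-dependent angle, so query-key dot products depend only on relative offsets. The sketch below is a minimal plain-PyTorch illustration of the standard RoPE formulation, not code from that repository; the function name `apply_rope` and the base of 10000 are conventional choices.

```python
import torch

def apply_rope(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply rotary position embeddings to x of shape (batch, seq, heads, dim).

    Channel pair (2i, 2i+1) at position p is rotated by the angle
    p * base**(-2i/dim).
    """
    _, seq_len, _, dim = x.shape
    assert dim % 2 == 0, "head dimension must be even"
    pos = torch.arange(seq_len, dtype=x.dtype, device=x.device)              # (seq,)
    inv_freq = base ** (-torch.arange(0, dim, 2, dtype=x.dtype, device=x.device) / dim)
    angles = torch.outer(pos, inv_freq)                                      # (seq, dim/2)
    cos = angles.cos()[None, :, None, :]                                     # broadcast to x
    sin = angles.sin()[None, :, None, :]
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = torch.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

# Example: rotate queries (and likewise keys) before attention.
q = torch.randn(2, 16, 8, 64)
q_rot = apply_rope(q)
```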
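Second, the SimPO objective named above: a reference-free preference loss that uses the length-normalized average log-probability of a response as an implicit reward and enforces a target margin gamma between chosen and rejected responses. A minimal sketch, assuming the summed token log-probabilities have already been computed; the default beta and gamma values here are illustrative, not the paper's tuned settings.

```python
import torch
import torch.nn.functional as F

def simpo_loss(sum_logp_chosen: torch.Tensor,
               sum_logp_rejected: torch.Tensor,
               len_chosen: torch.Tensor,
               len_rejected: torch.Tensor,
               beta: float = 2.0,
               gamma: float = 1.0) -> torch.Tensor:
    """SimPO: reference-free preference loss with length-normalized rewards.

    sum_logp_*: summed token log-probs of each response under the policy, shape (batch,)
    len_*: response lengths in tokens, shape (batch,)
    """
    # Implicit reward = average per-token log-prob, scaled by beta.
    r_chosen = beta * sum_logp_chosen / len_chosen
    r_rejected = beta * sum_logp_rejected / len_rejected
    # Bradley-Terry-style loss with a target reward margin gamma.
    return -F.logsigmoid(r_chosen - r_rejected - gamma).mean()
```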
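Finally, the TeraFLOPS entry above concerns measured training throughput. The usual way to turn a training run into a TFLOPS figure is the roughly 6 * params * tokens approximation (6 FLOPs per parameter per token for one forward plus backward pass); whether that repository uses exactly this formula is an assumption, and the helper below is a hypothetical illustration.

```python
def achieved_tflops(n_params: float, n_tokens: float,
                    seconds: float, n_gpus: int) -> float:
    """Estimate per-GPU training throughput in TFLOPS.

    Uses the common ~6 * params * tokens approximation for the total
    FLOPs of one forward + backward pass over the processed tokens.
    """
    total_flops = 6.0 * n_params * n_tokens
    return total_flops / seconds / n_gpus / 1e12

# Example: a 7B-parameter model processing 1M tokens in 30 s on 8 GPUs.
print(achieved_tflops(7e9, 1e6, 30.0, 8))  # 175.0 TFLOPS per GPU
```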