Rotary Transformer
☆1,078 · Mar 21, 2022 · Updated 3 years ago
Alternatives and similar repositories for roformer
Users interested in roformer are comparing it to the libraries listed below.
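For context on what these repositories relate to: roformer applies a rotary position embedding (RoPE), which rotates each pair of query/key features by a position-dependent angle so that attention scores depend only on relative offsets. A minimal NumPy sketch, using the "rotate-halves" pairing (equivalent to RoFormer's interleaved layout up to a feature permutation); the function name and shapes are illustrative, not the repo's API:

```python
import numpy as np

def rope(x, base=10000.0):
    """Rotary position embedding for x of shape (seq_len, dim), dim even.

    Feature pairs (x[:, i], x[:, i + dim//2]) at position p are rotated
    by the angle p * base**(-2i/dim); dot products between rotated
    queries and keys then depend only on their relative offset.
    """
    seq_len, dim = x.shape
    half = dim // 2
    inv_freq = base ** (-np.arange(half) / half)     # per-pair frequency
    angles = np.arange(seq_len)[:, None] * inv_freq  # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    # 2-D rotation applied pairwise to (x1, x2)
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)
```

Because each position applies a pure rotation, norms are preserved and position 0 (zero angle) is left unchanged; shifting query and key positions by the same amount leaves their dot product unchanged, which is the relative-position property RoPE is built around.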
- RoFormer V1 & V2 pytorch ☆519 · May 18, 2022 · Updated 3 years ago
- Fast and memory-efficient exact attention ☆22,460 · Updated this week
- Rectified Rotary Position Embeddings ☆389 · May 20, 2024 · Updated last year
- Ongoing research training transformer models at scale ☆15,461 · Updated this week
- [EMNLP 2021] SimCSE: Simple Contrastive Learning of Sentence Embeddings https://arxiv.org/abs/2104.08821 ☆3,644 · Oct 16, 2024 · Updated last year
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ☆22,030 · Jan 23, 2026 · Updated last month
- A BERT for retrieval and generation ☆859 · Feb 26, 2021 · Updated 5 years ago
- An upgraded SimBERT (SimBERTv2)! ☆444 · Mar 21, 2022 · Updated 3 years ago
- Keras implementation of transformers for humans ☆5,424 · Nov 11, 2024 · Updated last year
- Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab ☆3,156 · Jan 22, 2024 · Updated 2 years ago
- A Transformer model based on the Gated Attention Unit (preview version) ☆98 · Feb 24, 2023 · Updated 3 years ago
- An easy-to-use, scalable, and high-performance agentic RL framework based on Ray (PPO & DAPO & REINFORCE++ & TIS & vLLM & Ray & Async RL) ☆9,084 · Updated this week
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning ☆20,717 · Updated this week
- GLM (General Language Model) ☆3,442 · Nov 3, 2023 · Updated 2 years ago
- Example models using DeepSpeed ☆6,791 · Feb 7, 2026 · Updated 3 weeks ago
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ☆6,489 · Jan 14, 2026 · Updated last month
- Transformer-related optimization, including BERT and GPT ☆6,398 · Mar 27, 2024 · Updated last year
- A Python library for highly configurable transformers, easing model architecture search and experimentation ☆48 · Nov 30, 2021 · Updated 4 years ago
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" ☆13,299 · Dec 17, 2024 · Updated last year
- An upgraded RoFormer ☆154 · Aug 11, 2022 · Updated 3 years ago
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ☆9,528 · Updated this week
- LightSeq: A High Performance Library for Sequence Processing and Generation ☆3,303 · May 16, 2023 · Updated 2 years ago
- Train transformer language models with reinforcement learning ☆17,523 · Updated this week
- RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be directly trained like a GPT transformer (parallelizable)… ☆14,393 · Feb 21, 2026 · Updated 2 weeks ago
- Hackable and optimized Transformers building blocks, supporting a composable construction ☆10,356 · Feb 20, 2026 · Updated 2 weeks ago
- A Chinese BERT that uses words as the basic unit ☆476 · Nov 18, 2021 · Updated 4 years ago
- GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023) ☆7,669 · Jul 25, 2023 · Updated 2 years ago
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective ☆41,706 · Feb 27, 2026 · Updated last week
- PyTorch extensions for high-performance and large-scale training ☆3,400 · Apr 26, 2025 · Updated 10 months ago
- A PyTorch extension: tools for easy mixed precision and distributed training in PyTorch ☆8,926 · Feb 24, 2026 · Updated last week
- GlobalPointer: unified handling of nested and non-nested NER ☆259 · May 2, 2021 · Updated 4 years ago
- Keras implementation of Finite Scalar Quantization ☆85 · Oct 31, 2023 · Updated 2 years ago
- Open-source pre-training model framework in PyTorch & pre-trained model zoo ☆3,106 · May 9, 2024 · Updated last year
- Ongoing research training transformer language models at scale, including BERT & GPT-2 ☆2,230 · Aug 14, 2025 · Updated 6 months ago
- QLoRA: Efficient Finetuning of Quantized LLMs ☆10,843 · Jun 10, 2024 · Updated last year
- Foundation Architecture for (M)LLMs ☆3,135 · Apr 11, 2024 · Updated last year
- ☆1,559 · Feb 20, 2026 · Updated 2 weeks ago
- ☆880 · May 24, 2024 · Updated last year
- Simple vector whitening to improve sentence-embedding quality ☆487 · Jun 17, 2021 · Updated 4 years ago