Caiyun-AI / MUDDFormer
☆81 · Updated 2 months ago
Alternatives and similar repositories for MUDDFormer
Users interested in MUDDFormer are comparing it to the repositories listed below.
- [ICML 2025] Fourier Position Embedding: Enhancing Attention’s Periodic Extension for Length Generalization ☆84 · Updated 2 months ago
- ☆213 · Updated 5 months ago
- [COLM 2025] LoRI: Reducing Cross-Task Interference in Multi-Task Low-Rank Adaptation ☆145 · Updated last month
- Parameter-Efficient Fine-Tuning for Foundation Models ☆81 · Updated 4 months ago
- Lightning Attention-2: A Free Lunch for Handling Unlimited Sequence Lengths in Large Language Models ☆323 · Updated 5 months ago
- ☆147 · Updated 11 months ago
- ZO2 (Zeroth-Order Offloading): Full Parameter Fine-Tuning 175B LLMs with 18GB GPU Memory ☆167 · Updated 3 weeks ago
- ☆196 · Updated last year
- A generalized framework for subspace tuning methods in parameter-efficient fine-tuning ☆153 · Updated last month
- ☆45 · Updated last month
- [ICLR 2025 Spotlight] Official implementation for ToST (Token Statistics Transformer) ☆113 · Updated 5 months ago
- qwen-nsa ☆71 · Updated 4 months ago
- [EMNLP 2024] RWKV-CLIP: A Robust Vision-Language Representation Learner ☆140 · Updated 2 months ago
- A Tight-fisted Optimizer ☆48 · Updated 2 years ago
- Official code for the paper "LoRA-Pro: Are Low-Rank Adapters Properly Optimized?" ☆127 · Updated 4 months ago
- ☆70 · Updated 6 months ago
- Lion and Adam optimization comparison ☆63 · Updated 2 years ago
- ☆101 · Updated last month
- Implementation of Switch Transformers from the paper "Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficien… ☆114 · Updated 3 weeks ago
- [ICML 2025 Oral] Mixture of Lookup Experts ☆47 · Updated 2 months ago
- CPPO: Accelerating the Training of Group Relative Policy Optimization-Based Reasoning Models ☆147 · Updated 2 months ago
- ☆206 · Updated 9 months ago
- A repository for DenseSSMs ☆88 · Updated last year
- tinybig for deep function learning ☆61 · Updated 2 months ago
- My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing o… ☆43 · Updated 7 months ago
- Ring is a reasoning MoE LLM provided and open-sourced by InclusionAI, derived from Ling ☆89 · Updated this week
- TransMLA: Multi-Head Latent Attention Is All You Need ☆337 · Updated 3 weeks ago
- ☆112 · Updated last year
- The official GitHub page for the survey paper "Discrete Tokenization for Multimodal LLMs: A Comprehensive Survey". And this paper is unde… ☆35 · Updated this week
- DeepSeek Native Sparse Attention pytorch implementation ☆86 · Updated this week