[SIGIR'24] The official implementation code of MOELoRA.
☆192 · Jul 22, 2024 · Updated last year
Alternatives and similar repositories for MOELoRA-peft
Users interested in MOELoRA-peft are comparing it to the repositories listed below.
- LoRAMoE: Revolutionizing Mixture of Experts for Maintaining World Knowledge in Language Model Alignment ☆404 · Apr 29, 2024 · Updated 2 years ago
- ☆179 · Jul 22, 2024 · Updated last year
- Adapt an LLM to a Mixture-of-Experts model using parameter-efficient fine-tuning (LoRA), injecting the LoRA adapters into the FFN (a generic sketch of this pattern appears after this list). ☆83 · Oct 21, 2025 · Updated 6 months ago
- ☆277 · Oct 31, 2023 · Updated 2 years ago
- An Efficient LLM Fine-Tuning Factory Optimized for MoE PEFT ☆139 · Mar 11, 2025 · Updated last year
- Official Implementation of "Learning to Refuse: Towards Mitigating Privacy Risks in LLMs" ☆10 · Dec 13, 2024 · Updated last year
- [SIGIR'24] The official implementation code of MOELoRA. ☆37 · Aug 3, 2024 · Updated last year
- Token-level adaptation of LoRA matrices for downstream task generalization. ☆15 · Apr 14, 2024 · Updated 2 years ago
- ☆126 · Jul 6, 2024 · Updated last year
- X-LoRA: Mixture of LoRA Experts ☆270 · Aug 4, 2024 · Updated last year
- An Efficient "Factory" to Build Multiple LoRA Adapters ☆376 · Feb 13, 2025 · Updated last year
- [NeurIPS'24 Spotlight] The official implementation code of LLM-ESR. ☆51 · Jun 27, 2024 · Updated last year
- MoCLE (first MLLM with MoE for instruction customization and generalization) (https://arxiv.org/abs/2312.12379) ☆46 · Jul 1, 2025 · Updated 10 months ago
- This repository has been transferred to https://github.com/TUDB-Labs/MoE-PEFT ☆22 · Aug 16, 2024 · Updated last year
- [COLM 2024] LoraHub: Efficient Cross-Task Generalization via Dynamic LoRA Composition ☆669 · Jul 22, 2024 · Updated last year
- DiTASK: Multi-Task Fine-Tuning with Diffeomorphic Transformations (CVPR 2025) ☆14 · Jun 1, 2025 · Updated 11 months ago
- Analyzing and Reducing Catastrophic Forgetting in Parameter Efficient Tuning ☆36 · Nov 17, 2024 · Updated last year
- ☆44 · Oct 1, 2024 · Updated last year
- ☆16 · Nov 12, 2024 · Updated last year
- Efficient and Effective Weight-Ensembling Mixture of Experts for Multi-Task Model Merging. arXiv, 2024. ☆16 · Oct 28, 2024 · Updated last year
- Butler is a tool project for automated service management and task scheduling. ☆16 · Apr 19, 2026 · Updated 2 weeks ago
- ⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024) ☆1,001 · Dec 6, 2024 · Updated last year
- ☆201 · Jul 13, 2024 · Updated last year
- ☆68 · Dec 2, 2024 · Updated last year
- Source code of the paper "A Stronger Mixture of Low-Rank Experts for Fine-Tuning Foundation Models" (ICML 2025) ☆39 · Apr 2, 2025 · Updated last year
- Code for PHATGOOSE introduced in "Learning to Route Among Specialized Experts for Zero-Shot Generalization" ☆92 · Feb 27, 2024 · Updated 2 years ago
- Parameter-Efficient Sparsity Crafting From Dense to Mixture-of-Experts for Instruction Tuning on General Tasks (EMNLP'24) ☆146 · Sep 20, 2024 · Updated last year
- ☆10 · Apr 16, 2024 · Updated 2 years ago
- Official implementation of "DoRA: Weight-Decomposed Low-Rank Adaptation" ☆123 · Apr 28, 2024 · Updated 2 years ago
- [NeurIPS 2024 D&B] Evaluating Copyright Takedown Methods for Language Models ☆17 · Jul 17, 2024 · Updated last year
- Official repo for the NeurIPS'24 paper "WAGLE: Strategic Weight Attribution for Effective and Modular Unlearning in Large Language Models" ☆19 · Dec 16, 2024 · Updated last year
- A collection of MoE (Mixture of Experts) papers, code, tools, etc. ☆12 · Mar 15, 2024 · Updated 2 years ago
- Generated geosite.dat based on Antifilter Community List ☆26 · Apr 26, 2026 · Updated last week
- ☆77 · Apr 29, 2024 · Updated 2 years ago
- 🌏 UI component library for the future, based on WebComponent. ☆23 · Nov 12, 2024 · Updated last year
- ☆415 · Nov 2, 2023 · Updated 2 years ago
- Code and data for the FACTOR paper ☆53 · Nov 15, 2023 · Updated 2 years ago
- ☆15 · Apr 29, 2021 · Updated 5 years ago
- ☆32 · Aug 9, 2024 · Updated last year
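Most of the repositories above combine LoRA with some form of Mixture-of-Experts routing. As a rough orientation only, the sketch below shows the generic pattern of injecting several LoRA experts into a frozen FFN projection and mixing them with a softmax router. All names here (`MoELoRALinear`, `LoRAExpert`, `num_experts`, `rank`) are illustrative assumptions and do not correspond to the API of MOELoRA, LoRAMoE, X-LoRA, or any other listed project.

```python
# Minimal sketch of the LoRA-as-MoE pattern, assuming a PyTorch-style Linear layer.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LoRAExpert(nn.Module):
    """One low-rank adapter: contributes B(A(x)) on top of a frozen weight."""

    def __init__(self, dim_in: int, dim_out: int, rank: int = 8):
        super().__init__()
        self.A = nn.Linear(dim_in, rank, bias=False)   # down-projection
        self.B = nn.Linear(rank, dim_out, bias=False)  # up-projection
        nn.init.zeros_(self.B.weight)                  # adapter starts as a no-op

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.B(self.A(x))


class MoELoRALinear(nn.Module):
    """Frozen base linear layer plus a softmax-gated mixture of LoRA experts."""

    def __init__(self, base: nn.Linear, num_experts: int = 4, rank: int = 8):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False                    # only adapters and router train
        self.experts = nn.ModuleList(
            [LoRAExpert(base.in_features, base.out_features, rank) for _ in range(num_experts)]
        )
        self.router = nn.Linear(base.in_features, num_experts, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gate = F.softmax(self.router(x), dim=-1)               # (..., num_experts)
        expert_out = torch.stack([e(x) for e in self.experts], dim=-1)
        mixed = (expert_out * gate.unsqueeze(-2)).sum(dim=-1)  # weighted sum of experts
        return self.base(x) + mixed


# Example: wrap one FFN projection of a hypothetical frozen transformer block.
ffn_proj = nn.Linear(768, 3072)
layer = MoELoRALinear(ffn_proj, num_experts=4, rank=8)
out = layer(torch.randn(2, 16, 768))                           # -> (2, 16, 3072)
```

The listed projects differ mainly in how the gate is conditioned (per task, per token, or per layer) and in whether the mixture is dense or top-k sparse; consult each repository for its actual design.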