☆60 · May 4, 2024 · Updated 2 years ago
Alternatives and similar repositories for Pregated_MoE
Users interested in Pregated_MoE are comparing it to the repositories listed below.
- Open-source release of LazyDP, published at ASPLOS 2024 · ☆22 · May 5, 2024 · Updated 2 years ago
- ☆26 · Dec 3, 2025 · Updated 5 months ago
- ☆83 · May 27, 2025 · Updated 11 months ago
- ☆21 · Nov 27, 2025 · Updated 5 months ago
- GPU-based Distributed Point Functions (DPF) and 2-server private information retrieval (PIR) · ☆56 · Jan 27, 2023 · Updated 3 years ago
- ☆16 · Dec 4, 2025 · Updated 5 months ago
- ☆173 · Feb 1, 2025 · Updated last year
- [ICML‘25] Official code for the paper "Occult: Optimizing Collaborative Communication across Experts for Accelerated Parallel MoE Training an…" · ☆13 · Apr 17, 2025 · Updated last year
- Code release for AdapMoE, accepted at ICCAD 2024 · ☆38 · Apr 28, 2025 · Updated last year
- Explore Inter-layer Expert Affinity in MoE Model Inference · ☆16 · May 6, 2024 · Updated 2 years ago
- Tender: Accelerating Large Language Models via Tensor Decomposition and Runtime Requantization (ISCA'24) · ☆31 · Jul 4, 2024 · Updated last year
- Sirius, an efficient correction mechanism that significantly boosts contextual-sparsity models on reasoning tasks while maintaining its… · ☆21 · Sep 10, 2024 · Updated last year
- ☆133 · Nov 11, 2024 · Updated last year
- PyTorch library for cost-effective, fast, and easy serving of MoE models.