usc-isi / PipeEdge
PipeEdge: Pipeline Parallelism for Large-Scale Model Inference on Heterogeneous Edge Devices
☆30 · Updated last year
Alternatives and similar repositories for PipeEdge:
Users interested in PipeEdge are comparing it to the repositories listed below.
- This is a list of awesome edge-AI inference related papers. ☆95 · Updated last year
- ☆99 · Updated last year
- ☆39 · Updated 4 years ago
- A deep learning-driven scheduler for elastic training in deep learning clusters ☆29 · Updated 4 years ago
- ☆13 · Updated 5 years ago
- A Cluster-Wide Model Manager to Accelerate DNN Training via Automated Training Warmup ☆34 · Updated 2 years ago
- MobiSys #114 ☆21 · Updated last year
- A Deep Learning Cluster Scheduler ☆37 · Updated 4 years ago
- Cloud-edge collaborative inference 📚 Dynamic adaptive DNN surgery for inference acceleration on the edge ☆34 · Updated last year
- HeliosArtifact ☆20 · Updated 2 years ago
- ☆77 · Updated last year
- Official Repo for "LLM-PQ: Serving LLM on Heterogeneous Clusters with Phase-Aware Partition and Adaptive Quantization" ☆28 · Updated last year
- ☆14 · Updated 7 months ago
- ☆17 · Updated last year
- ☆20 · Updated 3 years ago
- ☆199 · Updated last year
- Lucid: A Non-Intrusive, Scalable and Interpretable Scheduler for Deep Learning Training Jobs ☆53 · Updated last year
- A curated list of early exiting (LLM, CV, NLP, etc.) ☆44 · Updated 7 months ago
- Autodidactic Neurosurgeon: Collaborative Deep Inference for Mobile Edge Intelligence via Online Learning ☆41 · Updated 3 years ago
- PyTorch implementation of the paper: Decomposing Vision Transformers for Collaborative Inference in Edge Devices ☆12 · Updated 7 months ago
- LLM serving cluster simulator ☆93 · Updated 10 months ago
- ☆40 · Updated 8 months ago
- Code for "Solving Large-Scale Granular Resource Allocation Problems Efficiently with POP", which appeared at SOSP 2021 ☆25 · Updated 3 years ago
- Create tiny ML systems for on-device learning. ☆20 · Updated 3 years ago
- A Portable C Library for Distributed CNN Inference on IoT Edge Clusters ☆83 · Updated 5 years ago
- ☆16 · Updated last year
- ☆49 · Updated 2 years ago
- INFOCOM 2024: Online Resource Allocation for Edge Intelligence with Colocated Model Retraining and Inference ☆12 · Updated 5 months ago
- Artifacts for our ASPLOS'23 paper ElasticFlow ☆52 · Updated 10 months ago
- InFi is a library for building input filters for resource-efficient inference. ☆38 · Updated last year