tangchen2 / Model-Compression

Model Compression
1. Pruning (BN Pruning)
2. Knowledge Distillation (Hinton)
3. Quantization (MNN)
4. Deployment (MNN)
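The first two items follow well-known recipes: BN pruning ranks channels by the magnitude of each BatchNorm scale (gamma) and removes the smallest ones, and Hinton-style knowledge distillation trains the student on the teacher's softened outputs plus the hard labels. Below is a minimal PyTorch sketch of both steps, written as an illustration rather than the repository's actual code; the pruning ratio, temperature `T`, and loss weight `alpha` are placeholder values, and the pruning helper assumes affine `BatchNorm2d` layers.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def select_channels_to_prune(model: nn.Module, prune_ratio: float = 0.5):
    """BN pruning (network slimming style): rank channels by |gamma| of each
    BatchNorm2d layer and mark the smallest `prune_ratio` fraction for removal."""
    gammas = torch.cat([
        m.weight.detach().abs().flatten()
        for m in model.modules() if isinstance(m, nn.BatchNorm2d)
    ])
    threshold = torch.quantile(gammas, prune_ratio)

    masks = {}
    for name, m in model.named_modules():
        if isinstance(m, nn.BatchNorm2d):
            masks[name] = m.weight.detach().abs() > threshold  # True = keep channel
    return masks

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.7):
    """Hinton et al. knowledge distillation: KL between temperature-softened
    teacher and student distributions (scaled by T^2) plus hard-label CE."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard
```

After pruning, the kept-channel masks would be used to build a slimmer model that is then fine-tuned (optionally with the distillation loss above) before quantization and deployment with MNN.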
