tangchen2 / Model-Compression
Model Compression:
1. Pruning (BN pruning)
2. Knowledge Distillation (Hinton)
3. Quantization (MNN)
4. Deployment (MNN)
80 · Dec 17, 2020 · Updated 5 years ago
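As a rough illustration of the second topic the repository covers, here is a minimal sketch of a Hinton-style knowledge-distillation loss in plain Python. This is not the repository's code: the function names and the example logits are hypothetical, and real training code would use a framework such as PyTorch with cross-entropy on hard labels added in.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about wrong classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 as suggested in Hinton et al. (2015).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature ** 2

# Hypothetical logits for a 3-class problem.
teacher = [5.0, 2.0, -1.0]
student = [3.0, 2.5, 0.0]
print(distillation_loss(teacher, student))
```

In practice this soft-target term is combined with the usual hard-label cross-entropy, weighted by a mixing coefficient.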
