thaonguyen19 / ModelDistillation-PyTorch
PyTorch implementation of "Distilling the Knowledge in a Neural Network" for model compression
59 stars · Nov 18, 2017 · Updated 8 years ago
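The repository implements Hinton et al.'s knowledge distillation, in which a small "student" network is trained to match the temperature-softened output distribution of a larger "teacher" alongside the true labels. A minimal sketch of that loss in PyTorch follows; the function name, temperature `T`, and weighting `alpha` are illustrative assumptions, not code from this repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Hypothetical helper, not taken from the repo.
    # Soft-target term: KL divergence between teacher and student
    # distributions softened by temperature T, scaled by T^2 so its
    # gradient magnitude stays comparable as T changes.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage: combine with a frozen teacher's logits during student training.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
```

When teacher and student logits agree exactly, the KL term vanishes and only the cross-entropy term contributes.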

Alternatives and similar repositories for ModelDistillation-PyTorch

Users interested in ModelDistillation-PyTorch are comparing it to the libraries listed below.

