yoshitomo-matsubara / torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies; part of the PyTorch Ecosystem. πŸ† 26 knowledge distillation methods presented at venues such as TPAMI, CVPR, ICLR, ECCV, NeurIPS, ICCV, and AAAI are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
β˜† 1,607 · Mar 31, 2026 · Updated last week
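To illustrate what "knowledge distillation" refers to here, below is a minimal sketch of the classic temperature-scaled distillation loss (Hinton et al.), one of the methods such frameworks implement. It is written in plain Python for clarity and is an assumption-laden illustration, not torchdistill's actual API.

```python
# Sketch of the classic temperature-scaled knowledge distillation loss.
# This is an illustrative example, NOT torchdistill's API.
import math


def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]


def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between teacher and student soft targets,
    scaled by T^2 as in the original formulation."""
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student soft predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl


# Identical logits give zero loss; diverging logits increase it.
loss_same = distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1])
loss_diff = distillation_loss([0.5, 1.0, 0.1], [2.0, 1.0, 0.1])
```

In practice a weighted sum of this loss and the ordinary cross-entropy on hard labels is used to train the student model.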

Alternatives and similar repositories for torchdistill

Users interested in torchdistill are comparing it to the libraries listed below.
