yoshitomo-matsubara / torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. πŸ† 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. have been implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and support benchmarking.
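For readers unfamiliar with knowledge distillation, the sketch below shows the vanilla distillation loss (Hinton et al., 2015) in plain PyTorch: a temperature-softened KL term between teacher and student logits combined with the usual cross-entropy. This is a minimal illustration of the technique the framework implements, not torchdistill's own API (torchdistill drives training through YAML configurations); the function name `kd_loss` and the `temperature`/`alpha` values are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=4.0, alpha=0.5):
    """Vanilla knowledge distillation loss: weighted sum of a soft-target
    KL term and the standard cross-entropy on hard labels.
    Note: illustrative sketch, not torchdistill's API."""
    # Soften both distributions with the temperature before comparing them.
    soft_teacher = F.log_softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_loss = F.kl_div(soft_student, soft_teacher, log_target=True,
                         reduction="batchmean") * (temperature ** 2)
    # Standard supervised loss on the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Toy usage with random logits for a 10-class problem.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(kd_loss(student_logits, teacher_logits, labels))
```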
β˜† 1,392 · Updated last month

Related projects

Alternatives and complementary repositories for torchdistill