IntelLabs / distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
☆4,395 · Updated 2 years ago
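As a rough illustration of the kind of transformation distiller automates, here is a minimal magnitude-pruning sketch in plain PyTorch. It uses `torch.nn.utils.prune`, not distiller's own API, and the toy model and 30% sparsity target are arbitrary examples:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy MLP standing in for a real model.
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Zero out the 30% of weights with the smallest L1 magnitude in each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # bake the mask into the weight tensor

# Report the achieved overall sparsity.
total = zeros = 0
for p in model.parameters():
    total += p.numel()
    zeros += (p == 0).sum().item()
print(f"sparsity: {zeros / total:.1%}")
```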
Alternatives and similar repositories for distiller
Users interested in distiller often compare it to the libraries listed below.
- Differentiable architecture search for convolutional and recurrent networks ☆3,962 · Updated 4 years ago
- Rethinking the Value of Network Pruning (PyTorch) (ICLR 2019) ☆1,514 · Updated 4 years ago
- MMdnn is a set of tools to help users inter-operate among different deep learning frameworks, e.g. model conversion and visualization. Converts models between Caffe, Keras, MXNet, TensorFlow, CNTK, PyTorch, ONNX and CoreML. ☆5,809 · Updated last year
- A GPU-accelerated library containing highly optimized building blocks and an execution engine for data processing to accelerate deep learning training and inference applications. ☆5,417 · Updated this week
- Model summary in PyTorch similar to `model.summary()` in Keras (usage sketch after this list) ☆4,044 · Updated last year
- On the Variance of the Adaptive Learning Rate and Beyond, the RAdam optimizer (usage sketch after this list) ☆2,550 · Updated 3 years ago
- Codebase for Image Classification Research, written in PyTorch. ☆2,154 · Updated last year
- Count the MACs / FLOPs of your PyTorch model (usage sketch after this list) ☆4,995 · Updated 10 months ago
- PyTorch implementation of "Efficient Neural Architecture Search via Parameters Sharing" ☆2,721 · Updated last year
- Converters for deep learning models across different deep learning frameworks. ☆3,248 · Updated last year
- Model analyzer in PyTorch ☆1,481 · Updated 2 years ago
- A curated list of neural network pruning resources. ☆2,449 · Updated last year
- [ICLR 2020] Once for All: Train One Network and Specialize it for Efficient Deployment ☆1,918 · Updated last year
- Pretrained EfficientNet, EfficientNet-Lite, MixNet, MobileNetV3 / V2, MNASNet A1 and B1, FBNet, Single-Path NAS ☆1,573 · Updated 11 months ago
- FLOPs counter for neural networks in the PyTorch framework (usage sketch after this list) ☆2,898 · Updated 4 months ago
- An Automatic Model Compression (AutoMC) framework for developing smaller and faster AI applications. ☆2,886 · Updated 2 years ago
- A PyTorch extension: tools for easy mixed precision and distributed training in PyTorch (usage sketch after this list) ☆8,670 · Updated 3 weeks ago
- Awesome Knowledge Distillation ☆3,680 · Updated last week
- micronet, a model compression and deployment library. Compression: quantization, including quantization-aware training (QAT) and high-bit (>2b) methods (DoReFa/Quantiz…) ☆2,248 · Updated 3 weeks ago
- [ICLR 2019] ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware ☆1,439 · Updated 9 months ago
- tensorboard for pytorch (and chainer, mxnet, numpy, ...) (usage sketch after this list) ☆7,943 · Updated last month
- Collection of recent methods on (deep) neural network compression and acceleration. ☆950 · Updated 2 months ago
- TensorFlow backend for ONNX (conversion sketch after this list) ☆1,305 · Updated last year
- Quantized Neural Network PACKage - mobile-optimized implementation of quantized neural network operators ☆1,541 · Updated 5 years ago
- Memory consumption and FLOP count estimates for convnets ☆924 · Updated 6 years ago
- Pretrained ConvNets for PyTorch: NASNet, ResNeXt, ResNet, InceptionV4, InceptionResnetV2, Xception, DPN, etc. (usage sketch after this list) ☆9,091 · Updated 3 years ago
- High-level library to help with training and evaluating neural networks in PyTorch flexibly and transparently. ☆4,665 · Updated 3 weeks ago
- PyTorch implementation of [1611.06440] Pruning Convolutional Neural Networks for Resource Efficient Inference ☆880 · Updated 5 years ago
- Unofficial implementation of the ImageNet, CIFAR-10 and SVHN augmentation policies learned by AutoAugment, using Pillow ☆1,484 · Updated 2 years ago
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility (a distillation-loss sketch follows this list) ☆1,940 · Updated 2 years ago
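Usage sketches for some of the libraries above

For the Keras-style model summary tool, a minimal sketch (assumes `pip install torchsummary`; the toy model and input size are examples):

```python
import torch.nn as nn
from torchsummary import summary

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10))

# Prints a Keras-style per-layer table with output shapes and parameter counts.
# Pass device="cpu" on machines without CUDA (older versions default to "cuda").
summary(model, input_size=(3, 224, 224), device="cpu")
```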
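For RAdam, a minimal sketch assuming the repo's `radam` module is importable (the exact module layout may differ between versions); the learning rate and weight decay are example values:

```python
import torch
import torch.nn as nn
from radam import RAdam  # from the RAdam repo listed above

model = nn.Linear(10, 2)
optimizer = RAdam(model.parameters(), lr=1e-3, weight_decay=1e-4)

# One toy optimization step on random data.
x, y = torch.randn(32, 10), torch.randint(0, 2, (32,))
loss = nn.functional.cross_entropy(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```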
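For the MAC/FLOP counter (published on PyPI as `thop`), a minimal sketch; the toy model and input shape are examples:

```python
import torch
import torch.nn as nn
from thop import profile

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10))

# profile() runs a forward pass with hooks and returns total MACs and params.
macs, params = profile(model, inputs=(torch.randn(1, 3, 224, 224),))
print(f"MACs: {macs:.3e}, params: {params:.3e}")
```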
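The FLOPs counter (`ptflops`) takes an input shape rather than a tensor; a minimal sketch (assumes `pip install ptflops`):

```python
import torch.nn as nn
from ptflops import get_model_complexity_info

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10))

# Returns human-readable strings when as_strings=True.
macs, params = get_model_complexity_info(
    model, (3, 224, 224), as_strings=True, print_per_layer_stat=False)
print(f"MACs: {macs}, params: {params}")
```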
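For mixed precision with NVIDIA apex, a minimal sketch of its (now-legacy) `amp` API; it requires a CUDA build of apex, and newer code would typically use `torch.cuda.amp` instead:

```python
import torch
import torch.nn as nn
from apex import amp

model = nn.Linear(10, 2).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

# O1 patches common ops to run in FP16 while keeping master weights in FP32.
model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

x, y = torch.randn(32, 10).cuda(), torch.randint(0, 2, (32,)).cuda()
loss = nn.functional.cross_entropy(model(x), y)
with amp.scale_loss(loss, optimizer) as scaled_loss:  # loss scaling for FP16
    scaled_loss.backward()
optimizer.step()
```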
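For the tensorboard-for-pytorch package (tensorboardX), a minimal scalar-logging sketch; the log directory name is an example:

```python
from tensorboardX import SummaryWriter

writer = SummaryWriter("runs/demo")  # example log directory
for step in range(100):
    writer.add_scalar("train/loss", 1.0 / (step + 1), step)
writer.close()
# View with: tensorboard --logdir runs
```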
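For the TensorFlow backend for ONNX (`onnx-tf`), a minimal conversion sketch; the file paths are placeholders, and the output format of `export_graph` (frozen graph vs. SavedModel) varies between onnx-tf versions:

```python
import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("model.onnx")  # placeholder path
tf_rep = prepare(onnx_model)          # wrap the ONNX graph as a TF representation
tf_rep.export_graph("model_tf")       # export for use from TensorFlow
```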
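For the pretrained-ConvNets collection (published as `pretrainedmodels`), a minimal sketch; the architecture name is one example from the package's list, and weights are downloaded on first use:

```python
import torch
import pretrainedmodels

print(pretrainedmodels.model_names[:5])  # available architectures
model = pretrainedmodels.__dict__["resnext101_32x4d"](
    num_classes=1000, pretrained="imagenet")
model.eval()
with torch.no_grad():
    logits = model(torch.randn(1, 3, 224, 224))
print(logits.shape)  # torch.Size([1, 1000])
```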
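The knowledge-distillation entries revolve around the classic Hinton-style soft-target loss; a minimal generic sketch, not the API of any listed repo, with example temperature and mixing weight:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard

s = torch.randn(8, 10, requires_grad=True)  # student logits
t = torch.randn(8, 10)                      # teacher logits
y = torch.randint(0, 10, (8,))              # labels
kd_loss(s, t, y).backward()
```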