Lyken17 / pytorch-OpCounter
Count the MACs / FLOPs of your PyTorch model.
☆5,048 · Updated last year
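For orientation, a minimal usage sketch, assuming the package is installed from PyPI as `thop` (the distribution name pytorch-OpCounter publishes under) and using a torchvision ResNet-18 as a stand-in model:

```python
import torch
from torchvision.models import resnet18
from thop import clever_format, profile

# Build a model and a dummy input matching the shape it expects.
model = resnet18()
dummy_input = torch.randn(1, 3, 224, 224)

# profile() traces one forward pass and returns the total
# multiply-accumulate count (MACs) and the parameter count.
macs, params = profile(model, inputs=(dummy_input,))

# clever_format() renders the raw counts in human-readable units.
macs, params = clever_format([macs, params], "%.3f")
print(f"MACs: {macs}, Params: {params}")
```

Note that THOP reports MACs; a common convention takes FLOPs ≈ 2 × MACs for multiply-add-dominated networks.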
Alternatives and similar repositories for pytorch-OpCounter
Users interested in pytorch-OpCounter are comparing it to the libraries listed below.
- FLOPs counter for neural networks in the PyTorch framework (see the usage sketch after this list) ☆2,949 · Updated last month
- Model analyzer in PyTorch ☆1,497 · Updated 2 years ago
- Collection of common code that's shared among different research projects in the FAIR computer vision team. ☆2,188 · Updated last month
- ResNeSt: Split-Attention Networks ☆3,264 · Updated 2 years ago
- RepVGG: Making VGG-style ConvNets Great Again ☆3,423 · Updated 2 years ago
- [ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods ☆2,378 · Updated last year
- Model summary in PyTorch similar to `model.summary()` in Keras (see the sketch after this list) ☆4,061 · Updated last year
- Efficient AI Backbones including GhostNet, TNT and MLP, developed by Huawei Noah's Ark Lab. ☆4,314 · Updated 6 months ago
- PyTorch implementation of SENet ☆2,336 · Updated 4 years ago
- A PyTorch implementation of EfficientNet ☆8,162 · Updated 3 years ago
- A quickstart and benchmark for PyTorch distributed training. ☆1,666 · Updated last year
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility ☆1,966 · Updated 2 years ago
- Codebase for Image Classification Research, written in PyTorch. ☆2,164 · Updated last year
- A CV toolkit for my papers. ☆2,049 · Updated 9 months ago
- label-smooth, amsoftmax, partial-fc, focal-loss, triplet-loss, lovasz-softmax. Maybe useful ☆2,250 · Updated 11 months ago
- Classification with PyTorch. ☆1,741 · Updated last year
- Official PyTorch code for "BAM: Bottleneck Attention Module (BMVC2018)" and "CBAM: Convolutional Block Attention Module (ECCV2018)" ☆2,183 · Updated 2 years ago
- A PyTorch Extension: Tools for easy mixed precision and distributed training in PyTorch ☆8,818 · Updated last week
- Awesome Knowledge-Distillation. Knowledge distillation papers (2014–2021), organized by category. ☆2,628 · Updated 2 years ago
- Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distille… ☆4,400 · Updated 2 years ago
- A curated list of neural network pruning resources. ☆2,477 · Updated last year
- Awesome Knowledge Distillation ☆3,746 · Updated 4 months ago
- Synchronized Batch Normalization implementation in PyTorch. ☆1,501 · Updated 4 years ago
- PyTorch implementation of MoCo: https://arxiv.org/abs/1911.05722 ☆5,078 · Updated 2 weeks ago
- PyTorch implementation of various Knowledge Distillation (KD) methods. ☆1,718 · Updated 3 years ago
- Pretrained ConvNets for PyTorch: NASNet, ResNeXt, ResNet, InceptionV4, InceptionResnetV2, Xception, DPN, etc. ☆9,106 · Updated 3 years ago
- Gradually-Warmup Learning Rate Scheduler for PyTorch ☆991 · Updated last year
- Official DeiT repository ☆4,267 · Updated last year
- My best practices for training on large datasets with PyTorch. ☆1,104 · Updated last year
- MobileNetV3 in PyTorch, with pretrained models provided ☆1,773 · Updated 2 years ago
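For comparison with pytorch-OpCounter, here is a sketch of the FLOPs-counter item near the top of the list, under the assumption that it refers to the `ptflops` package; the package identification is an assumption based on the description:

```python
import torchvision.models as models
from ptflops import get_model_complexity_info  # assumed package for the item above

model = models.resnet18()

# get_model_complexity_info() takes the input shape without the batch
# dimension and returns MAC and parameter counts, here as formatted strings.
macs, params = get_model_complexity_info(
    model, (3, 224, 224), as_strings=True, print_per_layer_stat=False
)
print(f"MACs: {macs}, Params: {params}")
```

Setting `print_per_layer_stat=True` instead prints a per-layer breakdown of the counts.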
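Likewise, a sketch of the Keras-style model-summary item, assuming it refers to the `torchsummary` package; again the identification is an assumption based on the description:

```python
import torchvision.models as models
from torchsummary import summary  # assumed package for the summary item above

model = models.resnet18()

# Prints a per-layer table of output shapes and parameter counts, in the
# spirit of Keras's model.summary(). input_size omits the batch dimension;
# device="cpu" keeps the dry-run forward pass off the GPU.
summary(model, input_size=(3, 224, 224), device="cpu")
```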