Alternatives and similar repositories for TaT
☆27 · Updated Jun 28, 2022
Users interested in TaT are comparing it to the repositories listed below.
- [ICML 2024] DetKDS: Knowledge Distillation Search for Object Detectors · ☆19 · Updated Jul 11, 2024
- PyTorch implementation of the NeurIPS 2024 paper "Reinforced Cross-Domain Knowledge Distillation on Time Series Data" · ☆16 · Updated Sep 26, 2024
- OpenMMLab Detection Toolbox and Benchmark · ☆27 · Updated Mar 4, 2024
- ☆31 · Updated Jun 18, 2020
- ☆10 · Updated Jan 27, 2022
- IEEE Transactions on Intelligent Transportation Systems (2024) · ☆24 · Updated Jul 22, 2025
- ☆34 · Updated Sep 15, 2021
- ☆22 · Updated Oct 16, 2025
- ☆87 · Updated Aug 31, 2023
- Comparing CNN+Softmax with CNN+SVM on the CIFAR-10 dataset · ☆15 · Updated Jan 26, 2019
- "Segmenter: Transformer for Semantic Segmentation" reproduced via mmsegmentation · ☆24 · Updated Aug 13, 2021
- GrFormer: A Novel Transformer on Grassmann Manifold for Infrared and Visible Image Fusion · ☆18 · Updated Dec 14, 2025
- Sketch-Based Image Retrieval · ☆10 · Updated Jul 13, 2018
- ☆10 · Updated Feb 22, 2022
- ☆47 · Updated Sep 9, 2021
- Official implementation of the paper "Improving Knowledge Distillation via Regularizing Feature Norm and Direction" · ☆24 · Updated Aug 3, 2023
- SEED: Self-supervised Distillation for Visual Representation · ☆16 · Updated Jul 20, 2022
- ☆25 · Updated Nov 6, 2024
- Channel pruning for accelerating very deep neural networks · ☆13 · Updated Mar 8, 2021
- ☆10 · Updated Dec 13, 2022
- ☆10 · Updated Feb 4, 2025
- Official implementation of the paper "Masked Distillation with Receptive Tokens" (ICLR 2023) · ☆71 · Updated Apr 14, 2023
- DropNet: Reducing Neural Network Complexity via Iterative Pruning (ICML 2020) · ☆16 · Updated Aug 24, 2020
- Minute-long video generation at 24 FPS · ☆61 · Updated Mar 28, 2026
- This repo uses a combination of logit and feature distillation to teach a PSPNet model with a ResNet18 backbone using the PSPNet mod… · ☆11 · Updated Sep 30, 2021
- [CVPR 2022] Official implementation of "Knowledge Distillation with the Reused Teacher Classifier" · ☆103 · Updated Jun 16, 2022
- ☆15 · Updated Mar 21, 2025
- ☆17 · Updated Mar 4, 2024
- PyTorch implementation of the paper "Bounding Box Tightness Prior for Weakly Supervised Image Segmentation" (MICCAI 2021) · ☆24 · Updated Oct 15, 2022
- [ACM MM '23] Official implementation of the paper "Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty" · ☆14 · Updated Nov 22, 2023
- MICCAI 2022 paper · ☆24 · Updated Sep 21, 2023
- ☆49 · Updated Aug 23, 2022
- Official PyTorch implementation of "The Linear Attention Resurrection in Vision Transformer" · ☆16 · Updated Sep 7, 2024
- ☆12 · Updated this week
- PyTorch code and checkpoint release for VanillaKD: https://arxiv.org/abs/2305.15781 · ☆77 · Updated Nov 21, 2023
- Submission for the MICCAI HACKATHON: https://miccai-hackathon.com/#participate · ☆15 · Updated Jul 19, 2023
- ☆31 · Updated Jul 1, 2024
- ☆16 · Updated Nov 25, 2021
- Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020) · ☆30 · Updated Aug 19, 2020