☆34 · Updated Aug 20, 2023
Alternatives and similar repositories for Spherical-Knowledge-Distillation
Users interested in Spherical-Knowledge-Distillation are comparing it to the libraries listed below.
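Most of the repositories below build on the classic soft-label distillation objective of Hinton et al. (2015): the student matches the teacher's temperature-softened output distribution, with the KL term scaled by T². A minimal pure-Python sketch (function names are illustrative; Spherical Knowledge Distillation itself reportedly also rescales the logits' norm before softening, which is omitted here):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: larger T yields a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients stay comparable across temperatures
    # (Hinton et al., 2015). In practice this is combined with the
    # ordinary cross-entropy loss on the ground-truth labels.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return T * T * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

When student and teacher logits agree, the loss is zero; any mismatch yields a positive penalty, since KL divergence is non-negative.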
- Code for the ECCV 2020 paper "Improving Knowledge Distillation via Category Structure" (☆10, updated Mar 15, 2021)
- Implementation of the paper "Task-Oriented Feature Distillation" (☆43, updated Apr 25, 2022)
- ☆59, updated Jun 18, 2021
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" (☆43, updated Sep 27, 2022)
- PyTorch implementation of the IEEE TNNLS 2022 paper "Distilling a Powerful Student Model via Online Knowledge Distillation" (☆31, updated Nov 11, 2021)
- ☆47, updated Sep 9, 2021
- Official implementation of the ICLR 2023 paper "Function-Consistent Feature Distillation" (☆31, updated Jul 5, 2023)
- Code for "Self-Distillation as Instance-Specific Label Smoothing" (☆15, updated Oct 22, 2020)
- Code for "Robustifying Token Attention for Vision Transformers" (☆20, updated Dec 31, 2023)
- Official code for the paper "Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation" (☆17, updated Jan 19, 2022)
- [NeurIPS 2022] Projector Ensemble Feature Distillation (☆30, updated Jan 4, 2024)
- CVPR 2020 Oral: "Revisiting Knowledge Distillation via Label Smoothing Regularization" (☆583, updated Feb 15, 2023)
- A simple, fast, efficient, end-to-end 3D object detector without NMS (☆29, updated Nov 30, 2021)
- EfficientVLM: Fast and Accurate Vision-Language Models via Knowledge Distillation and Modal-adaptive Pruning (ACL 2023) (☆34, updated Jul 18, 2023)
- ☆19, updated May 28, 2020
- A collection of papers on zero-shot/data-free knowledge distillation from 2019 to 2021 (☆11, updated Sep 8, 2021)
- PyTorch implementation of the ICCV 2021 workshop paper "SimDis: Simple Distillation Baselines for Improving Small Self-supervised Models" (☆14, updated Jul 15, 2021)
- PyTorch implementation of DANN (Domain-Adversarial Training of Neural Networks) (☆10, updated Dec 4, 2023)
- Official PyTorch implementation of PS-KD (☆88, updated Aug 5, 2022)
- Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf (☆264, updated Oct 3, 2019)
- A collection of top-conference papers and reproduced code for face liveness detection (☆13, updated Apr 23, 2021)
- ☆10, updated Feb 22, 2022
- Code for the CVPR 2020 paper "Distilling Cross-Task Knowledge via Relationship Matching" (☆48, updated Feb 7, 2021)
- kNN-TL: k-Nearest-Neighbor Transfer Learning for Low-Resource Neural Machine Translation (ACL 2023) (☆11, updated Jul 26, 2023)
- SEED: Self-supervised Distillation for Visual Representation (☆16, updated Jul 20, 2022)
- ☆23, updated May 6, 2022
- Implementation of FAPM (ICASSP 2023) (☆25, updated Jun 19, 2023)
- 🏆 The 2nd-place submission to the CVPR 2021 Evoked Emotion from Videos challenge (☆17, updated Jun 7, 2021)
- Channel pruning for accelerating very deep neural networks (☆13, updated Mar 8, 2021)
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) (☆106, updated Sep 9, 2019)
- Distilling Knowledge via Intermediate Classifiers (☆16, updated Oct 3, 2021)
- DropNet: Reducing Neural Network Complexity via Iterative Pruning (ICML 2020) (☆16, updated Aug 24, 2020)
- Official PyTorch implementation of "Deep Metric Learning with Spherical Embedding" (NeurIPS 2020) (☆42, updated Nov 28, 2020)
- [AAAI 2023] Official PyTorch code for "Curriculum Temperature for Knowledge Distillation" (☆180, updated Dec 3, 2024)
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation" (TNNLS): https://ieeexplore.ieee.org/abstract/… (☆12, updated Dec 21, 2022)
- ☆114, updated Apr 21, 2021
- Codebase for "Reducing Representation Drift in Online Continual Learning" (☆14, updated Jun 8, 2021)
- Official implementation of the NeurIPS 2021 paper "Contextual Similarity Aggregation with Self-attention for Visual Re-ranking" (☆26, updated Apr 19, 2022)
- ACCV 2022 source code of the paper "Feature Decoupled Knowledge Distillation via Spatial Pyramid Pooling" (☆12, updated Jul 5, 2023)