☆34 · Aug 20, 2023 · Updated 2 years ago
Alternatives and similar repositories for Spherical-Knowledge-Distillation
Users that are interested in Spherical-Knowledge-Distillation are comparing it to the libraries listed below.
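Nearly every repository below is a variant of knowledge distillation. As shared context, here is a minimal sketch of the vanilla Hinton-style KD objective that these methods extend; the temperature `T` and weight `alpha` are illustrative defaults, not values taken from any listed repository.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Vanilla knowledge-distillation loss (a sketch, not any repo's exact code).

    Combines a softened KL term between teacher and student distributions
    with the usual cross-entropy on the hard labels.
    """
    # Soften both distributions with temperature T; scale the KL term by
    # T^2 so its gradient magnitude stays comparable across temperatures.
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    log_student = F.log_softmax(student_logits / T, dim=1)
    kl = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kl + (1.0 - alpha) * ce

# Example: one batch of 8 samples, 10 classes
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = kd_loss(student_logits, teacher_logits, labels)
loss.backward()
```

The feature-distillation and self-distillation repositories below replace or augment the KL term (matching intermediate features, attention maps, or past predictions) while keeping the same teacher-to-student framing.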
- Code for the ECCV 2020 paper "Improving Knowledge Distillation via Category Structure" ☆10 · Mar 15, 2021 · Updated 5 years ago
- Implementation of the paper "Task-Oriented Feature Distillation" ☆43 · Apr 25, 2022 · Updated 3 years ago
- ☆59 · Jun 18, 2021 · Updated 4 years ago
- Official implementation of "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021) ☆119 · Feb 9, 2021 · Updated 5 years ago
- Code for the paper "Self-Distillation from the Last Mini-Batch for Consistency Regularization" ☆44 · Sep 27, 2022 · Updated 3 years ago
- PyTorch implementation of "Distilling a Powerful Student Model via Online Knowledge Distillation" (IEEE TNNLS 2022) ☆31 · Nov 11, 2021 · Updated 4 years ago
- ☆47 · Sep 9, 2021 · Updated 4 years ago
- Official implementation of the paper "Function-Consistent Feature Distillation" (ICLR 2023) ☆31 · Jul 5, 2023 · Updated 2 years ago
- Code for "Robustifying Token Attention for Vision Transformers" ☆20 · Dec 31, 2023 · Updated 2 years ago
- [NeurIPS'22] Projector Ensemble Feature Distillation ☆30 · Jan 4, 2024 · Updated 2 years ago
- Code for "Revisiting Knowledge Distillation via Label Smoothing Regularization" (CVPR 2020 Oral) ☆584 · Feb 15, 2023 · Updated 3 years ago
- EfficientVLM: Fast and Accurate Vision-Language Models via Knowledge Distillation and Modal-adaptive Pruning (ACL 2023) ☆34 · Jul 18, 2023 · Updated 2 years ago
- Vectorgraph Image Painter ☆12 · Mar 24, 2019 · Updated 7 years ago
- ☆19 · May 28, 2020 · Updated 5 years ago
- A collection of papers on zero-shot/data-free knowledge distillation from 2019 to 2021 ☆11 · Sep 8, 2021 · Updated 4 years ago
- A PyTorch implementation of the ICCV 2021 workshop paper "SimDis: Simple Distillation Baselines for Improving Small Self-supervised Models" ☆14 · Jul 15, 2021 · Updated 4 years ago
- This project attacked widely used EEG spellers, e.g., P300 spellers and SSVEP spellers, with adversarial perturbation templates. We cons… ☆14 · Jul 14, 2020 · Updated 5 years ago
- PyTorch implementation of DANN (Domain-Adversarial Training of Neural Networks) ☆10 · Dec 4, 2023 · Updated 2 years ago
- Official PyTorch implementation of PS-KD ☆88 · Aug 5, 2022 · Updated 3 years ago
- ☆13 · Jan 17, 2018 · Updated 8 years ago
- A deep learning project to classify brainwave signals ☆14 · Jan 27, 2019 · Updated 7 years ago
- ☆10 · Feb 22, 2022 · Updated 4 years ago
- Code for the CVPR 2020 paper "Distilling Cross-Task Knowledge via Relationship Matching" ☆48 · Feb 7, 2021 · Updated 5 years ago
- An Open-Source GUI for CNN-based EEG Decoding and Model Interpretation ☆20 · Sep 8, 2022 · Updated 3 years ago
- SEED: Self-supervised Distillation for Visual Representation ☆16 · Jul 20, 2022 · Updated 3 years ago
- ☆23 · May 6, 2022 · Updated 3 years ago
- Implementation of FAPM (ICASSP 2023) ☆25 · Jun 19, 2023 · Updated 2 years ago
- 🏆 The 2nd-place submission to the CVPR 2021 Evoked Emotion from Videos challenge ☆17 · Jun 7, 2021 · Updated 4 years ago
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) ☆106 · Sep 9, 2019 · Updated 6 years ago
- Distilling Knowledge via Intermediate Classifiers ☆16 · Oct 3, 2021 · Updated 4 years ago
- ☆11 · Sep 1, 2024 · Updated last year
- DropNet: Reducing Neural Network Complexity via Iterative Pruning (ICML 2020) ☆16 · Aug 24, 2020 · Updated 5 years ago
- ☆11 · Nov 5, 2024 · Updated last year
- [AAAI 2023] Official PyTorch code for "Curriculum Temperature for Knowledge Distillation" ☆180 · Dec 3, 2024 · Updated last year
- Source code for "Dual-Level Knowledge Distillation via Knowledge Alignment and Correlation" (TNNLS), https://ieeexplore.ieee.org/abstract/… ☆12 · Dec 21, 2022 · Updated 3 years ago
- ☆114 · Apr 21, 2021 · Updated 4 years ago
- ACCV 2022 source code for the paper "Feature Decoupled Knowledge Distillation via Spatial Pyramid Pooling" ☆12 · Jul 5, 2023 · Updated 2 years ago
- Code for the paper "Few Shot Network Compression via Cross Distillation" (AAAI 2020) ☆30 · Jan 31, 2020 · Updated 6 years ago
- Code for the comparison methods ☆12 · Mar 7, 2022 · Updated 4 years ago