snudm-starlab / PETLinks
PET: Parameter-efficient Knowledge Distillation on Transformer (PLOS One)
☆15 · Updated 2 weeks ago
Alternatives and similar repositories for PET
Users interested in PET are comparing it to the libraries listed below.
- Falcon: Lightweight and Accurate Convolution Based on Depthwise Separable Convolution (KAIS) ☆44 · Updated 2 weeks ago
- Flexible Convolutional Neural Network ☆22 · Updated last year
- SensiMix: Sensitivity-Aware 8-bit Index & 1-bit Value Mixed Precision Quantization for BERT Compression (PLOS One) ☆34 · Updated 2 weeks ago
- ☆33 · Updated 2 years ago
- Pea-KD: Parameter-efficient and accurate knowledge distillation on BERT (PLOS One) ☆35 · Updated 2 weeks ago
- Vector multiplication on Low-rank Matrix Factorization ☆46 · Updated last year
- Structured pruning algorithm for pruning Transformer ☆31 · Updated last year
- SynQ: Accurate Zero-shot Quantization by Synthesis-aware Fine-tuning (ICLR 2025) ☆28 · Updated 6 months ago
- 🪖 [Technical Research Personnel] Data ingestion/visualization and blog notes 🪖 ☆71 · Updated 2 weeks ago
- A collection of recommender system resources ☆85 · Updated last year
- [Zoom & Facebook Live] Weekly AI Arxiv Season 2 ☆967 · Updated 2 years ago
- Public repository of the Kakao recommendation team ☆346 · Updated last week
- Paper reviews on NLP, mainly LLMs ☆33 · Updated last year
- Repository of code implementing recommender system papers ☆67 · Updated 2 years ago
- Boostcamp AI Tech - Product Serving materials