cliang1453 / task-aware-distillation

Less is More: Task-aware Layer-wise Distillation for Language Model Compression (ICML 2023)

Related projects

Alternatives and complementary repositories for task-aware-distillation