cliang1453 / task-aware-distillation

Less is More: Task-aware Layer-wise Distillation for Language Model Compression (ICML 2023)
★ 32 · Updated last year

Alternatives and similar repositories for task-aware-distillation: