cliang1453 / task-aware-distillation

Less is More: Task-aware Layer-wise Distillation for Language Model Compression (ICML 2023)

35 stars · Updated last year

Alternatives and similar repositories for task-aware-distillation

Users interested in task-aware-distillation are comparing it to the libraries listed below.
