cliang1453 / task-aware-distillation

Less is More: Task-aware Layer-wise Distillation for Language Model Compression (ICML 2023)
36 stars · Updated 2 years ago
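The paper behind this repository proposes task-aware layer-wise distillation: a smaller student model is trained so that its intermediate hidden states match (a task-relevant view of) a larger teacher's hidden states, layer by layer. The sketch below illustrates only the generic layer-wise matching idea with a plain MSE objective; the function names, the layer mapping, and the toy vectors are all illustrative assumptions, not the repository's actual implementation (which additionally learns task-aware filters over the teacher states).

```python
def mse(a, b):
    """Mean squared error between two equal-length vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def layerwise_distill_loss(teacher_layers, student_layers, layer_map):
    """Average MSE between mapped teacher/student hidden states.

    teacher_layers, student_layers: lists of per-layer hidden-state vectors.
    layer_map: pairs (t_idx, s_idx) choosing which teacher layer each
    student layer should imitate (a deeper teacher is typically mapped
    onto a shallower student).
    """
    losses = [mse(teacher_layers[t], student_layers[s]) for t, s in layer_map]
    return sum(losses) / len(losses)

# Toy example: a 4-layer teacher distilled into a 2-layer student.
teacher = [[0.0, 1.0], [1.0, 2.0], [2.0, 3.0], [3.0, 4.0]]
student = [[1.0, 2.0], [3.0, 4.0]]
loss = layerwise_distill_loss(teacher, student, layer_map=[(0, 0), (2, 1)])
# → 1.0 (each mapped pair differs by 1.0 in every coordinate)
```

In practice the hidden states would be tensors from a transformer (e.g. per-token activations), and this loss would be added to the student's ordinary task loss with a weighting coefficient.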

Alternatives and similar repositories for task-aware-distillation

Users interested in task-aware-distillation are comparing it to the libraries listed below.
