cliang1453 / task-aware-distillation
Less is More: Task-aware Layer-wise Distillation for Language Model Compression (ICML2023)
40 · Aug 28, 2023 · Updated 2 years ago
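The paper's core idea, task-aware layer-wise distillation, can be sketched roughly as follows. This is an illustrative sketch only, not the repository's implementation: the function name, the use of plain MSE per layer, and the normalized importance weights are all assumptions made for clarity.

```python
import numpy as np

def layerwise_distill_loss(teacher_states, student_states, layer_weights):
    """Illustrative layer-wise distillation loss (names are hypothetical).

    The student mimics matched teacher hidden states via a per-layer MSE;
    per-layer importance weights decide how much each layer contributes,
    standing in for the paper's task-aware weighting.
    """
    assert len(teacher_states) == len(student_states) == len(layer_weights)
    # mean squared error between each matched teacher/student layer
    per_layer = [np.mean((t - s) ** 2)
                 for t, s in zip(teacher_states, student_states)]
    w = np.array(layer_weights, dtype=float)
    w = w / w.sum()  # normalize importance weights to sum to 1
    return float(np.dot(w, per_layer))

# Toy example: 2 layers, sequence length 3, hidden size 4.
t = [np.ones((3, 4)), np.zeros((3, 4))]
s = [np.ones((3, 4)), np.ones((3, 4))]   # second layer mismatched
loss = layerwise_distill_loss(t, s, [1.0, 1.0])  # → 0.5
```

In the actual method, the layer weights would be learned jointly with the student so that layers more useful for the downstream task receive more distillation pressure; here they are fixed constants purely for demonstration.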

Alternatives and similar repositories for task-aware-distillation

Users interested in task-aware-distillation are comparing it to the libraries listed below.

