thu-coai / MiniPLM

[ICLR 2025] MiniPLM: Knowledge Distillation for Pre-Training Language Models
68 stars · Updated last year

Alternatives and similar repositories for MiniPLM

Users interested in MiniPLM are comparing it to the libraries listed below.
