lnairGT / CLIP-Distillation

Knowledge Distillation using Contrastive Language-Image Pretraining (CLIP) without a teacher model.
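The description suggests training a student model to match CLIP's joint embedding space directly, rather than running a separate teacher network. As a minimal sketch of that idea (not the repository's actual code — the loss, shapes, and function name here are assumptions), a student encoder can be fitted by minimizing a cosine-distance loss against fixed CLIP embeddings used as targets:

```python
import numpy as np

def cosine_distill_loss(student_emb, target_emb):
    """Mean (1 - cosine similarity) between student embeddings and
    fixed target embeddings (e.g. precomputed CLIP embeddings
    standing in for a teacher). Both arrays are shaped (N, D)."""
    s = student_emb / np.linalg.norm(student_emb, axis=1, keepdims=True)
    t = target_emb / np.linalg.norm(target_emb, axis=1, keepdims=True)
    # cosine similarity per row, averaged into a scalar loss
    return float(np.mean(1.0 - np.sum(s * t, axis=1)))

rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))
print(cosine_distill_loss(emb, emb))  # ~0.0: identical embeddings align perfectly
```

In a real training loop the student embeddings would come from a small trainable encoder and the loss would be backpropagated; the point of the sketch is only the objective, where CLIP's embeddings serve as regression targets instead of a live teacher's logits.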
12 stars · Updated 6 months ago
