cambridgeltl / mirror-bert

[EMNLP'21] Mirror-BERT: converting pretrained language models into universal text encoders without labels.

Related projects: