Distilling BERT using natural language generation.
☆39 · Aug 13, 2023 · Updated 2 years ago
Alternatives and similar repositories for d-bert
Users interested in d-bert are comparing it to the libraries listed below.
- Research code for the ACL 2020 paper "Distilling Knowledge Learned in BERT for Text Generation" · ☆129 · Jun 30, 2021 · Updated 4 years ago
- (no description) · ☆14 · Mar 21, 2020 · Updated 6 years ago
- TensorFlow implementation of RankGAN (Adversarial Ranking for Language Generation) · ☆22 · Jun 15, 2018 · Updated 7 years ago
- (no description) · ☆11 · Nov 10, 2020 · Updated 5 years ago
- GPT-2 fine-tuned on Seinfeld scripts · ☆14 · May 20, 2019 · Updated 6 years ago
- Distilling Task-Specific Knowledge from BERT into Simple Neural Networks · ☆15 · Aug 28, 2020 · Updated 5 years ago
- [EMNLP 2022] Distillation-Resistant Watermarking (DRW) for model protection in NLP · ☆13 · Aug 17, 2023 · Updated 2 years ago
- Refined dataset for the Stanford Sentiment Treebank, as used in Yoon Kim (2014) · ☆12 · Apr 1, 2018 · Updated 8 years ago
- Dirichlet Latent Variable Hierarchical Recurrent Encoder-Decoder for dialogue generation (EMNLP 2019)
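Several of the repositories above build on knowledge distillation: a small student model is trained to match the temperature-softened output distribution of a larger teacher such as BERT. As a point of reference (not the method of any specific repo listed here), the classic soft-target loss from Hinton et al. (2015) can be sketched as follows; the function names and logit values are illustrative only.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: higher T yields a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy between the teacher's softened distribution and the
    student's, scaled by T^2 so gradients stay comparable across T."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    ce = -sum(p * math.log(q) for p, q in zip(p_teacher, p_student))
    return temperature ** 2 * ce

# A student that reproduces the teacher's logits incurs a lower loss
# than one that ranks the classes in the opposite order.
teacher = [3.0, 1.0, 0.2]
loss_matched = distillation_loss([3.0, 1.0, 0.2], teacher)
loss_reversed = distillation_loss([0.2, 1.0, 3.0], teacher)
```

In practice this soft-target term is usually combined with the ordinary cross-entropy on the hard labels, weighted by a mixing hyperparameter.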