[ACL 2020] DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering
☆121 · Updated May 22, 2023
Alternatives and similar repositories for deformer
Users interested in deformer are comparing it to the libraries listed below.
- CommonsenseQA · ☆10 · Updated Mar 28, 2020
- LGEB: Benchmark of Language Generation Evaluation · ☆16 · Updated Oct 21, 2022
- The source code of FastBERT (ACL2020) · ☆609 · Updated Oct 29, 2021
- ☆11 · Updated Jul 17, 2020
- ☆24 · Updated Jun 14, 2019
- Source code of the ACL2019 paper "Simple and Effective Text Matching with Richer Alignment Features". · ☆340 · Updated Oct 11, 2019
- Code for the paper "Multi-Hop Paragraph Retrieval for Open-Domain Question Answering" · ☆36 · Updated Jun 21, 2022
- Source code for our "MMM" paper at AAAI 2020 · ☆40 · Updated May 4, 2020
- Implementation for the paper "Unsupervised Domain Adaptation on Reading Comprehension" · ☆30 · Updated May 21, 2020
- Chinese pre-trained ELECTRA model: a Chinese model pre-trained with adversarial learning · ☆141 · Updated Mar 22, 2020
- AIR retriever for Multi-Hop QA (ACL 2020 paper) · ☆30 · Updated Jul 18, 2020
- Author Profiling for Abuse Detection (COLING 2018) · ☆10 · Updated Dec 8, 2022
- ☆279 · Updated Dec 8, 2020
- Implementation of RealFormer using PyTorch · ☆101 · Updated Dec 27, 2020
- ☆10 · Updated Apr 16, 2021
- BERT Baseline for the Natural Questions · ☆11 · Updated Jan 24, 2019
- Source code for ACL 2021 paper "ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning" · ☆85 · Updated May 26, 2021
- This is the PyTorch implementation of the ACL 2019 paper "RankQA: Neural Question Answering with Answer Re-Ranking". · ☆83 · Updated Dec 13, 2021
- The dataset and PyTorch implementation for the ACL 2020 paper "MATINF: A Jointly Labeled Large-Scale Dataset for Classification, Question Answering and Summarization" · ☆43 · Updated Sep 7, 2020
- Code for using and evaluating SpanBERT. · ☆904 · Updated Jul 25, 2023
- ☆48 · Updated Jan 21, 2021
- This is the PyTorch implementation of the ACL 2020 long paper "A Self-Training Method for Machine Reading Comprehension with Soft Evidence Extraction" · ☆34 · Updated Aug 14, 2020
- Cross-Lingual Machine Reading Comprehension (EMNLP 2019) · ☆67 · Updated Nov 6, 2019
- A PyTorch implementation of the ACL2019 paper "Simple and Effective Text Matching with Richer Alignment Features". · ☆306 · Updated Aug 24, 2022
- This is the official code repository for NumNet+ (https://leaderboard.allenai.org/drop/submission/blu418v76glsbnh1qvd0) · ☆176 · Updated Jul 25, 2024
- Code for our ACL 2021 paper "ConSERT: A Contrastive Framework for Self-Supervised Sentence Representation Transfer" · ☆541 · Updated Dec 10, 2021
- ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations; large-scale Chinese pre-trained ALBERT models · ☆3,984 · Updated Nov 21, 2022
- Source code of K-BERT (AAAI2020) · ☆984 · Updated Jan 27, 2023
- ☆81 · Updated Apr 29, 2018
- Source code and dataset for ACL 2019 paper "Cognitive Graph for Multi-Hop Reading Comprehension at Scale" · ☆458 · Updated Mar 31, 2023
- Research code for ACL 2020 paper: "Distilling Knowledge Learned in BERT for Text Generation". · ☆129 · Updated Jun 30, 2021
- Chinese spatial semantic understanding evaluation benchmark · ☆39 · Updated Aug 10, 2022
- A list of recent papers on knowledge-based machine reading comprehension. · ☆26 · Updated Aug 4, 2020
- BERT distillation (distillation experiments based on BERT) · ☆314 · Updated Jul 30, 2020
- Method to improve inference time for BERT. This is an implementation of the paper titled "PoWER-BERT: Accelerating BERT Inference via Progressive Word-vector Elimination" · ☆62 · Updated Sep 17, 2025
- ☆64 · Updated Jul 17, 2020
- Code for the paper "Adaptive Transformers for Learning Multimodal Representations" (ACL SRW 2020) · ☆43 · Updated Oct 20, 2022
- A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm with RoBERTa-wwm-large as the teacher · ☆66 · Updated Mar 30, 2020
- Chinese BigBird pre-trained model · ☆96 · Updated Jul 5, 2022