BitVoyage / FastBERT
A reproduction of the ACL 2020 FastBERT paper: https://arxiv.org/pdf/2004.02178.pdf
☆193 · Updated 3 years ago
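The FastBERT paper linked above speeds up inference with adaptive early exit: a small student classifier after each transformer layer predicts a label distribution, and if the prediction's normalized entropy drops below a "speed" threshold, later layers are skipped. A minimal sketch of that mechanism (hypothetical helper names, not the repository's actual code):

```python
import math

def normalized_entropy(probs):
    """Entropy of a distribution, normalized to [0, 1] by log(num_classes)."""
    h = -sum(p * math.log(p) for p in probs if p > 0)
    return h / math.log(len(probs))

def early_exit(layer_distributions, speed=0.5):
    """Return (layer_index, probs) for the first layer whose student
    classifier is confident enough (entropy below the 'speed' threshold);
    fall back to the final layer if none is."""
    for i, probs in enumerate(layer_distributions):
        if normalized_entropy(probs) < speed:
            return i, probs
    return len(layer_distributions) - 1, layer_distributions[-1]

# An uncertain layer-0 prediction is passed over; the confident
# layer-1 prediction triggers the exit.
layer, probs = early_exit([[0.5, 0.5], [0.9, 0.1]], speed=0.5)
```

Raising `speed` trades accuracy for latency: more samples exit at shallow layers, which is the knob the paper tunes.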
Alternatives and similar repositories for FastBERT
Users interested in FastBERT are comparing it to the libraries listed below.
- BERT distillation (distillation experiments based on BERT) ☆313 · Updated 4 years ago
- BERT fine-tuning for CMRC2018, CJRC, DRCD, CHID, C3 ☆183 · Updated 5 years ago
- Joint learning of machine retrieval and reading; rank-6 solution for the 2nd national "Military Intelligent Machine Reading" Challenge (LES Cup) ☆128 · Updated 4 years ago
- ☆279 · Updated 4 years ago
- Kesci LES Cup: slides, code, and write-ups from the top-10 teams of the 2nd national "Military Intelligent Machine Reading" Challenge ☆132 · Updated 5 years ago
- Simple experiments with Pattern-Exploiting Training on Chinese ☆171 · Updated 4 years ago
- Transforms multi-label classification into a sentence-pair task, with more training data and information ☆178 · Updated 5 years ago
- Pre-trained model usage based on TensorFlow 1.x; supports single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision, with flexible training, evaluation, and prediction ☆58 · Updated 3 years ago
- Chinese Language Generation Evaluation: a benchmark for Chinese generation tasks ☆247 · Updated 4 years ago
- ☆91 · Updated 5 years ago
- A TensorFlow-based deep-learning NLP framework with a Scikit-Learn-style API; supports 40+ model classes covering language modeling, text classification, NER, MRC, knowledge distillation, and more ☆115 · Updated 2 years ago
- NEZHA: Neural Contextualized Representation for Chinese Language Understanding ☆261 · Updated 3 years ago
- Chinese NLP datasets ☆155 · Updated 5 years ago
- ☆156 · Updated 3 years ago
- Data Augmentation for NLP ☆295 · Updated 4 years ago
- Transformer implementations (architectures, task examples, serving, and more) ☆95 · Updated 3 years ago
- ☆89 · Updated 5 years ago
- DIAC 2019: question-equivalence detection competition based on adversarial attacks ☆81 · Updated 5 years ago
- Notes on the key points of using T5 models in Keras ☆172 · Updated 3 years ago
- Annotated BERT code from scratch, with the inputs and outputs of every step; suitable for beginners ☆93 · Updated 2 years ago
- Training a masked-language-model BERT from scratch ☆137 · Updated 2 years ago
- Unsupervised word segmentation and syntactic parsing based on BERT ☆110 · Updated 5 years ago
- Solving the LIC 2019 machine reading comprehension task with BERT ☆89 · Updated 6 years ago
- Pre-trained Chinese ELECTRA model, based on adversarial learning ☆140 · Updated 5 years ago
- Top-1 solution for the CCF 2020 QA matching competition ☆266 · Updated 4 years ago
- Collections of Chinese reading comprehension datasets ☆217 · Updated 5 years ago
- 3rd-place online solution for the Tianchi epidemic text challenge ☆227 · Updated 4 years ago
- DistilBERT for Chinese: large-scale pre-trained distilled BERT models for Chinese ☆92 · Updated 5 years ago
- Rank-2 solution (no BERT) for the 2019 Language and Intelligence Challenge - DuReader 2.0 Machine Reading Comprehension ☆127 · Updated 5 years ago
- TensorFlow code and pre-trained models for BERT and ERNIE ☆145 · Updated 6 years ago