circlePi / Pretraining-Yourself-Bert-From-Scratch
Pretrain a masked-language-model (MLM) BERT from scratch
☆138 · Updated 2 years ago
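For orientation only, the sketch below shows what "pretraining a masked BERT from scratch" looks like with the Hugging Face transformers and datasets libraries; this is not the repository's actual code, and the small model size, the `corpus.txt` path, and the hyperparameters are placeholder assumptions.

```python
# Minimal MLM-from-scratch sketch (illustrative, not circlePi's implementation).
from datasets import load_dataset
from transformers import (BertConfig, BertForMaskedLM, BertTokenizerFast,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

# Reuse an existing Chinese vocabulary, but initialize the model weights randomly.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
config = BertConfig(vocab_size=tokenizer.vocab_size, num_hidden_layers=4)  # deliberately tiny
model = BertForMaskedLM(config)  # random init, i.e. "from scratch"

# corpus.txt is a placeholder: one raw-text training sentence per line.
dataset = load_dataset("text", data_files={"train": "corpus.txt"})["train"]
dataset = dataset.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=128),
                      batched=True)

# The collator applies dynamic 15% token masking, which is the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="mlm-from-scratch",
                           num_train_epochs=1,
                           per_device_train_batch_size=16),
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()
```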
Alternatives and similar repositories for Pretraining-Yourself-Bert-From-Scratch
Users interested in Pretraining-Yourself-Bert-From-Scratch are comparing it to the libraries listed below.
- NEZHA: Neural Contextualized Representation for Chinese Language Understanding ☆261 · Updated 4 years ago
- Simple experiments with Pattern-Exploiting Training on Chinese ☆174 · Updated 4 years ago
- ☆157 · Updated 4 years ago
- Kesci Les Cup (科赛网-莱斯杯): 2nd National "Military Intelligent Machine Reading" Challenge, summary of the top-10 teams' slides, write-ups, and code ☆134 · Updated 5 years ago
- 法研杯 2019 reading-comprehension track, top-3 solution ☆150 · Updated last year
- Label Mask for Multi-label Classification ☆57 · Updated 4 years ago
- Reproduction of the ACL 2020 FastBERT paper (https://arxiv.org/pdf/2004.02178.pdf) ☆194 · Updated 3 years ago
- Joint learning of machine retrieval and reading; rank-6 solution for the Les Cup (莱斯杯) 2nd National "Military Intelligent Machine Reading" Challenge ☆128 · Updated 4 years ago
- Tianchi epidemic text challenge, 3rd-place online solution write-up ☆228 · Updated 4 years ago
- 2019 Chinese human-machine dialogue natural language understanding competition, hosted by the Social Media Processing committee of the Chinese Information Processing Society of China ☆76 · Updated 5 years ago
- Using BERT for LIC2019 machine reading comprehension ☆90 · Updated 6 years ago
- BERT distillation (distillation experiments based on BERT) ☆315 · Updated 5 years ago
- DIAC2019 question-equivalence competition based on adversarial attacks ☆82 · Updated 5 years ago
- PyTorch implementation of unsupervised SimCSE for Chinese ☆135 · Updated 4 years ago
- A TensorFlow-based, Scikit-Learn-style deep-learning NLP framework. Supports 40+ model classes covering language modeling, text classification, NER, MRC, knowledge distillation, and more ☆117 · Updated 2 years ago
- ☆87 · Updated 3 years ago
- BERT fine-tuning for CMRC2018, CJRC, DRCD, CHID, C3 ☆184 · Updated 5 years ago
- Chinese NER using Lattice LSTM. Reproduction of the ACL 2018 paper. ☆130 · Updated 5 years ago
- Datagrand 2019 information extraction competition, rank 9 ☆130 · Updated 5 years ago
- Chinese text summarization with UniLM ☆44 · Updated 5 years ago
- Reproduction of the paper "Named Entity Recognition as Dependency Parsing" ☆131 · Updated 4 years ago
- ☆34 · Updated 4 years ago
- Datagrand (达观) algorithm competition NER task, from retraining BERT to fine-tuning and prediction ☆75 · Updated 2 years ago
- CCF 2020 QA match competition, top 1 ☆267 · Updated 4 years ago
- Fine-tune pretrained language models for multi-label classification (can load well-known open-source TF-format models such as BERT, RoBERTa, BERT-wwm, and ALBERT) ☆141 · Updated 5 years ago
- Global AI Technology Innovation Competition, Track 3, champion solution ☆239 · Updated 4 years ago
- Some baselines for LIC2020 (http://lic2020.cipsc.org.cn/) ☆219 · Updated 5 years ago
- Tianchi epidemic text challenge ☆51 · Updated 5 years ago
- ☆280 · Updated 4 years ago
- Rank-2 solution (no BERT) for the 2019 Language and Intelligence Challenge, DuReader 2.0 Machine Reading Comprehension ☆128 · Updated 5 years ago