circlePi / Pretraining-Yourself-Bert-From-Scratch
Pretraining a masked-language-model BERT from scratch
☆137 · Updated 2 years ago
Alternatives and similar repositories for Pretraining-Yourself-Bert-From-Scratch
Users interested in Pretraining-Yourself-Bert-From-Scratch are comparing it to the repositories listed below.
- Simple experiments with Pattern-Exploiting Training on Chinese ☆171 · Updated 4 years ago
- PyTorch implementation of unsupervised SimCSE for Chinese (see the SimCSE loss sketch after this list) ☆134 · Updated 3 years ago
- ☆278 · Updated 3 years ago
- ☆87 · Updated 3 years ago
- NEZHA: Neural Contextualized Representation for Chinese Language Understanding ☆262 · Updated 3 years ago
- Solution write-up for third place online in the Tianchi epidemic text challenge ☆228 · Updated 4 years ago
- Reproduction of the paper "Named Entity Recognition as Dependency Parsing" ☆130 · Updated 3 years ago
- Joint learning of machine retrieval and reading; rank-6 solution for the 莱斯杯 2nd national "Military Intelligence Machine Reading" challenge ☆127 · Updated 4 years ago
- Using BiLSTM-CRF, BERT, and other methods for sequence tagging tasks (a CRF Viterbi-decoding sketch follows this list) ☆415 · Updated last year
- Data Augmentation with a Generation Approach for Low-resource Tagging Tasks ☆80 · Updated 4 years ago
- Reproduction of the ACL 2020 FastBERT paper; paper at https://arxiv.org/pdf/2004.02178.pdf ☆193 · Updated 3 years ago
- Transformer-CRF named entity recognition ☆106 · Updated 6 years ago
- Implementation of LSTM+CRF in PyTorch ☆43 · Updated 4 years ago
- EMNLP 2019 paper: A Lexicon-based Graph Neural Network for Chinese NER ☆137 · Updated 10 months ago
- BERT annotated from scratch for beginners: code comments with the input and output of every step ☆93 · Updated 2 years ago
- DataGrand 2019 information extraction competition, rank 9 ☆130 · Updated 5 years ago
- GlobalPointer: unified handling of nested and flat NER (see the span-scoring sketch after this list) ☆253 · Updated 4 years ago
- Simple experiments with the R-Drop method on Chinese tasks (an R-Drop loss sketch follows this list) ☆91 · Updated 3 years ago
- ☆278 · Updated 4 years ago
- Code for ACL 2019: Entity-Relation Extraction as Multi-Turn Question Answering ☆73 · Updated last year
- BERT_MRC, a SOTA model for the NER task ☆61 · Updated last year
- Chinese NER using Lattice LSTM; reproduction of the ACL 2018 paper ☆129 · Updated 5 years ago
- The 2019 Chinese human-machine dialogue natural language understanding competition, organized by the Social Media Processing technical committee of the Chinese Information Processing Society of China ☆74 · Updated 5 years ago
- ☆26 · Updated 4 years ago
- Using BERT for the LIC2019 machine reading comprehension task ☆89 · Updated 5 years ago
- LIC2020 relation extraction competition: a PyTorch implementation of Su Jianlin's model ☆101 · Updated 4 years ago
- Baidu AIStudio event extraction competition ☆224 · Updated 2 years ago
- TensorFlow 1.x pre-trained model wrapper supporting single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision; flexible training, validation, and prediction ☆58 · Updated 3 years ago
- Using pre-trained BERT models for Chinese and English NER with 🤗Transformers ☆136 · Updated 4 years ago
- Implementation of SimCSE and ESimCSE on Chinese datasets ☆192 · Updated 2 years ago
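
Two entries above (the unsupervised SimCSE implementation and the SimCSE+ESimCSE repository) center on the same training objective: encode each sentence twice so that the two dropout masks act as minimal data augmentation, then pull the two views together with an InfoNCE loss. Below is a minimal sketch of that loss, assuming `[batch, dim]` sentence embeddings from any encoder; the tensors are random stand-ins, not code from those repositories.

```python
# Minimal sketch of the unsupervised SimCSE objective. The random tensors
# stand in for two forward passes of the SAME batch through a BERT-style
# encoder kept in train mode, so only the dropout masks differ.
import torch
import torch.nn.functional as F

def simcse_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
    """InfoNCE over two dropout-augmented views; z1, z2: [batch, dim]."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    # Cosine-similarity matrix: row i compares view 1 of sentence i with
    # view 2 of every sentence in the batch; the diagonal holds the positives.
    sim = z1 @ z2.t() / temperature
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(sim, labels)

if __name__ == "__main__":
    torch.manual_seed(0)
    z1, z2 = torch.randn(8, 768), torch.randn(8, 768)  # stand-in embeddings
    print(simcse_loss(z1, z2))
```

The 0.05 temperature is the SimCSE paper's default. ESimCSE additionally builds positives by word repetition and keeps a momentum-encoded queue of extra negatives, both of which this sketch omits.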
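Several entries (BiLSTM-CRF, Transformer-CRF, LSTM+CRF, Lattice LSTM) share a CRF output layer, and its Viterbi decoding step is the part most often reimplemented. Here is a self-contained sketch under assumed shapes; none of it is taken from the listed repositories.

```python
# Viterbi decoding for a linear-chain CRF: find the tag sequence that
# maximizes emission scores plus tag-to-tag transition scores.
import torch

def viterbi_decode(emissions: torch.Tensor, transitions: torch.Tensor) -> list:
    """emissions: [seq_len, num_tags]; transitions[i, j]: score of tag i -> tag j."""
    score = emissions[0]   # best score ending in each tag at step 0
    history = []           # backpointers, one [num_tags] tensor per step
    for t in range(1, emissions.size(0)):
        # next_score[i, j] = best path ending in tag i, then moving to tag j
        next_score = score.unsqueeze(1) + transitions + emissions[t].unsqueeze(0)
        score, backptr = next_score.max(dim=0)
        history.append(backptr)
    # Backtrack from the best final tag.
    best_tag = int(score.argmax())
    path = [best_tag]
    for backptr in reversed(history):
        best_tag = int(backptr[best_tag])
        path.append(best_tag)
    return path[::-1]

if __name__ == "__main__":
    torch.manual_seed(0)
    # Toy example: 5 tokens, 4 tags, random scores.
    print(viterbi_decode(torch.randn(5, 4), torch.randn(4, 4)))
```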
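The GlobalPointer entry treats nested and flat NER uniformly by scoring every candidate (head token, tail token) span. A rough sketch of that span-scoring step follows, with assumed shapes; the rotary position embeddings and per-entity-type heads of the real model are omitted.

```python
# GlobalPointer-style span scoring for ONE entity type: every (start, end)
# token pair gets a score, so nested and flat entities are handled alike.
import torch

def span_scores(h: torch.Tensor, w_start: torch.Tensor, w_end: torch.Tensor) -> torch.Tensor:
    """h: [batch, seq, hidden]; w_start/w_end: [hidden, head_dim] projections."""
    q = h @ w_start  # start ("head") representations
    k = h @ w_end    # end ("tail") representations
    scores = torch.einsum("bmd,bnd->bmn", q, k)  # scores[b, i, j] = span (i, j)
    # Only spans with start <= end are valid: mask the strict lower triangle.
    invalid = torch.ones_like(scores).tril(-1).bool()
    return scores.masked_fill(invalid, float("-inf"))

if __name__ == "__main__":
    torch.manual_seed(0)
    h = torch.randn(2, 10, 768)                # stand-in encoder output
    w_s, w_e = torch.randn(768, 64), torch.randn(768, 64)
    print(span_scores(h, w_s, w_e).shape)      # torch.Size([2, 10, 10])
```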
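The R-Drop entry uses the same two-dropout-pass idea as SimCSE, but for supervised tasks: run the batch through the model twice and add a symmetric KL term between the two predictive distributions on top of the usual cross-entropy. A sketch under the same stand-in assumptions; the weight `alpha` is a tunable hyperparameter, not a value from that repository.

```python
# R-Drop: cross-entropy on two dropout-perturbed forward passes plus a
# symmetric KL term that pushes the two predictive distributions together.
import torch
import torch.nn.functional as F

def rdrop_loss(logits1, logits2, labels, alpha: float = 4.0) -> torch.Tensor:
    """logits1/logits2: [batch, classes] from two passes of the same batch."""
    ce = 0.5 * (F.cross_entropy(logits1, labels) + F.cross_entropy(logits2, labels))
    logp, logq = F.log_softmax(logits1, dim=-1), F.log_softmax(logits2, dim=-1)
    kl = 0.5 * (F.kl_div(logp, logq, log_target=True, reduction="batchmean")
                + F.kl_div(logq, logp, log_target=True, reduction="batchmean"))
    return ce + alpha * kl

if __name__ == "__main__":
    torch.manual_seed(0)
    logits1, logits2 = torch.randn(8, 3), torch.randn(8, 3)  # stand-in passes
    labels = torch.randint(0, 3, (8,))
    print(rdrop_loss(logits1, logits2, labels))
```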