circlePi / Pretraining-Yourself-Bert-From-Scratch
Train a masked-language-model BERT from scratch
☆136 · Updated 2 years ago
Alternatives and similar repositories for Pretraining-Yourself-Bert-From-Scratch:
Users interested in Pretraining-Yourself-Bert-From-Scratch are comparing it to the libraries listed below.
- Simple experiments with Pattern-Exploiting Training on Chinese ☆170 · Updated 4 years ago
- NEZHA: Neural Contextualized Representation for Chinese Language Understanding ☆262 · Updated 3 years ago
- PyTorch implementation of unsupervised SimCSE for Chinese ☆133 · Updated 3 years ago
- ☆88 · Updated 3 years ago
- Reproduction of the FastBERT paper (ACL 2020): https://arxiv.org/pdf/2004.02178.pdf ☆193 · Updated 3 years ago
- Joint learning of machine retrieval and reading comprehension; rank-6 solution for the Laisi Cup, the 2nd National "Military Intelligent Machine Reading" Challenge ☆127 · Updated 4 years ago
- ☆277 · Updated 2 years ago
- Use deep models including BiLSTM, ABCNN, ESIM, RE2, BERT, etc. and evaluate on 5 Chinese NLP datasets: LCQMC, BQ Corpus, ChineseSTS, OCN… ☆76 · Updated 2 years ago
- 3rd-place online solution for the Tianchi epidemic text challenge ☆229 · Updated 4 years ago
- ☆278 · Updated 4 years ago
- Sequence tagging with BiLSTM-CRF, BERT, and other methods ☆414 · Updated last year
- Sentence-matching models, including unsupervised SimCSE, ESimCSE, and PromptBERT, and supervised SBERT and CoSENT ☆97 · Updated 2 years ago
- The 2019 Chinese human-machine dialogue NLU competition, hosted by the Social Media Processing Committee of the Chinese Information Processing Society of China ☆74 · Updated 4 years ago
- Fine-tune pretrained language models for multi-label classification (can load well-known open-source TF-format models such as BERT, RoBERTa, BERT-wwm, and ALBERT) ☆139 · Updated 4 years ago
- NLP sentence encoding, sentence embeddings, and semantic similarity: BERT_avg, BERT_whitening, SBERT, SimCSE ☆175 · Updated 3 years ago
- Transform multi-label classification into a sentence-pair task, with more training data and information ☆178 · Updated 5 years ago
- Implementation of LSTM+CRF in PyTorch ☆43 · Updated 4 years ago
- Chinese NLP datasets ☆153 · Updated 5 years ago
- BERT annotated from scratch: code comments with the input and output of every step, suitable for beginners ☆93 · Updated 2 years ago
- A TensorFlow-based deep learning NLP framework with a Scikit-Learn-style design. Supports 40+ model classes covering language modeling, text classification, NER, MRC, knowledge distillation, and more ☆114 · Updated last year
- BERT_MRC, a SOTA model for the NER task ☆60 · Updated 11 months ago
- Championship solution for Track 3 of the Global AI Technology Innovation Competition ☆237 · Updated 3 years ago
- datagrand 2019 information extraction competition, rank 9 ☆130 · Updated 5 years ago
- Truly "Deep Learning for Humans" ☆141 · Updated 3 years ago
- Calling pretrained models with TensorFlow 1.x; supports single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision, with flexible training, validation, and prediction ☆58 · Updated 3 years ago
- Top-3 solution in the reading comprehension track of the 2019 CAIL (法研杯) competition ☆150 · Updated last year
- NER task from the Datagrand algorithm competition: from re-pretraining BERT to fine-tuning and prediction ☆75 · Updated 2 years ago
- BERT distillation (distillation experiments based on BERT) ☆311 · Updated 4 years ago
- ☆27 · Updated 4 years ago
- Tianchi epidemic text challenge ☆48 · Updated 4 years ago