yanqiuxia / BERT-PreTrain
Fine-tunes BERT on Chinese-domain data with both character-level masking and whole-word masking (wwm), without using the TensorFlow Estimator API
☆23 · Updated 5 years ago
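The distinction this repo is built around, character-level masking versus whole-word masking (wwm), can be sketched as follows. This is a minimal illustration, not the repo's actual code; it assumes the input has already been word-segmented (e.g. by jieba), and the `char_mask`/`wwm_mask` names are hypothetical:

```python
import random

MASK = "[MASK]"

def char_mask(text, mask_rate=0.15, seed=0):
    """Character-level masking: each character is masked independently."""
    rng = random.Random(seed)
    return [MASK if rng.random() < mask_rate else ch for ch in text]

def wwm_mask(words, mask_rate=0.15, seed=0):
    """Whole-word masking (wwm): when a word is selected, every character
    in it is replaced, so masked spans align with word boundaries."""
    rng = random.Random(seed)
    out = []
    for w in words:
        if rng.random() < mask_rate:
            out.extend(MASK for _ in w)  # mask all characters of the word
        else:
            out.extend(w)
    return out
```

For example, `wwm_mask(["自然", "语言", "处理"], mask_rate=1.0)` masks all six characters, but always in two-character word-aligned spans, whereas `char_mask` may mask any single character in isolation.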
Alternatives and similar repositories for BERT-PreTrain
Users interested in BERT-PreTrain are comparing it to the repositories listed below
- This repo contains some experiments on text matching with the Chinese LCQMC dataset ☆27 · Updated 5 years ago
- NER task from the Datagrand algorithm competition, from retraining BERT to fine-tuning and prediction ☆75 · Updated 2 years ago
- ☆92 · Updated 5 years ago
- The 2019 Chinese human-machine dialogue natural language understanding competition, hosted by the Social Media Processing Committee of the Chinese Information Processing Society of China ☆76 · Updated 5 years ago
- Joint learning of machine retrieval and reading comprehension; rank-6 solution for the LES Cup, the 2nd national "Military Intelligence Machine Reading" challenge ☆128 · Updated 4 years ago
- Text Matching Based on LCQMC: A Large-scale Chinese Question Matching Corpus ☆15 · Updated 4 years ago
- Keras implementations of distilling a 12-layer BERT into CNN, BiLSTM, and 3-layer BERT students ☆29 · Updated 5 years ago
- Chinese pretrained UniLM ☆28 · Updated 5 years ago
- Chinese wwm masking and n-gram masking based on jieba ☆11 · Updated 6 years ago
- NLP experiments: new-word mining plus continued pre-training of a pretrained model ☆48 · Updated 2 years ago
- A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm-large ☆65 · Updated 5 years ago
- ☆45 · Updated 4 years ago
- Using LEAR for NER extraction ☆29 · Updated 3 years ago
- Named entity extraction with iterated dilated convolutions ☆45 · Updated 6 years ago
- ☆129 · Updated 2 years ago
- bert_chinese ☆39 · Updated 3 years ago
- ☆41 · Updated 3 years ago
- ☆19 · Updated 2 years ago
- Fine-tunes pretrained language models for multi-label classification (can load well-known open-source TF-format models such as BERT, RoBERTa, BERT-wwm, and ALBERT) ☆141 · Updated 5 years ago
- A reading-comprehension model adapted from Su Jianlin's code and ported to Python 3.6, built on dilated convolutions, mixed character/word embeddings, the RAdam optimizer, and Baidu Baike word vectors ☆23 · Updated 6 years ago
- 2020 Language and Intelligence Challenge: relation extraction task (https://aistudio.baidu.com/aistudio/competition/detail/31?lang=zh_CN) ☆24 · Updated 5 years ago
- A demo of data augmentation for NLP ☆48 · Updated 5 years ago
- BERT fine-tuning for CMRC2018, CJRC, DRCD, CHID, and C3 ☆184 · Updated 5 years ago
- An introductory article on dialogue rewriting ☆98 · Updated 2 years ago
- Fine-tunes pretrained language models (BERT, RoBERTa, XLBert, etc.) to compute the similarity between two texts (cast as a sentence-pair classification task), suitable for Chinese text ☆90 · Updated 5 years ago
- A long-text classifier implemented in PyTorch ☆46 · Updated 6 years ago
- Pytorch-BERT-CRF-NER; Chinese named entity recognition ☆47 · Updated 4 years ago
- Multi-label text classification with BERT and ALBERT ☆28 · Updated 4 years ago
- Pretrained-model invocation based on TensorFlow 1.x, supporting single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision; flexible training, validation, and prediction ☆59 · Updated 4 years ago
- Converts https://github.com/brightmart/albert_zh to Google's format ☆62 · Updated 5 years ago