SilentMoebuta / simple_bert_for_tf2
Build BERT as a Keras layer using TF 2.0.
☆18 · Updated last year
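The repo's stated goal is wrapping BERT as a Keras layer under TF 2. As a rough illustration of the pattern (not the repo's actual code), the sketch below wraps a single tiny BERT-style encoder block in a `tf.keras.layers.Layer` subclass; all hyperparameters and the class name `MiniBertLayer` are illustrative assumptions.

```python
# Hedged sketch: a BERT-style encoder block as a custom Keras layer in TF 2.x.
# The hyperparameters and class name are illustrative, not from the repo.
import tensorflow as tf

class MiniBertLayer(tf.keras.layers.Layer):
    """One BERT-style encoder block: multi-head self-attention plus a
    feed-forward network, each with a residual connection and layer norm."""

    def __init__(self, hidden_size=64, num_heads=4, ff_size=128, **kwargs):
        super().__init__(**kwargs)
        self.attn = tf.keras.layers.MultiHeadAttention(
            num_heads=num_heads, key_dim=hidden_size // num_heads)
        self.ffn = tf.keras.Sequential([
            tf.keras.layers.Dense(ff_size, activation="gelu"),
            tf.keras.layers.Dense(hidden_size),
        ])
        self.norm1 = tf.keras.layers.LayerNormalization()
        self.norm2 = tf.keras.layers.LayerNormalization()

    def call(self, x):
        # Post-norm residual blocks, as in the original BERT encoder.
        x = self.norm1(x + self.attn(x, x))
        return self.norm2(x + self.ffn(x))

# Usage: the layer composes with the functional API like any built-in layer.
inputs = tf.keras.Input(shape=(16, 64))        # (seq_len, hidden_size)
outputs = MiniBertLayer(hidden_size=64)(inputs)
model = tf.keras.Model(inputs, outputs)
```

Because the block subclasses `tf.keras.layers.Layer`, it participates in Keras model saving, `summary()`, and training loops without extra glue.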
Related projects:
- Some demos of knowledge distillation in NLP ☆19 · Updated 3 years ago
- Pretrained-model usage based on TensorFlow 1.x, supporting single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision; flexible training, validation, and prediction. ☆58 · Updated 3 years ago
- BERT multi-GPU training and pretraining ☆29 · Updated 4 years ago
- ☆32 · Updated 3 years ago
- Deep models including BiLSTM, ABCNN, ESIM, RE2, BERT, etc., evaluated on 5 Chinese NLP datasets: LCQMC, BQ Corpus, ChineseSTS, OCN… ☆74 · Updated 2 years ago
- ☆15 · Updated 4 years ago
- Benchmark model implementations for several NLP tasks, including text classification, named entity recognition, entity relation extraction, NL2SQL, CKBQA, and various BERT downstream-task applications. ☆47 · Updated 3 years ago
- Detailed annotations of the implementation principles and matrix operations in Su Jianlin's bert4keras, for easier study; bert4keras: https://github.com/bojone/bert4keras ☆41 · Updated 3 years ago
- A deep-learning NLP framework built on TensorFlow with a Scikit-Learn-style API. Supports 40+ model classes covering language models, text classification, NER, MRC, knowledge distillation, and more. ☆114 · Updated last year
- NER task from the DataGrand algorithm competition, from re-pretraining BERT through fine-tuning and prediction. ☆76 · Updated last year
- ☆91 · Updated 4 years ago
- Sentence-matching models, including unsupervised SimCSE, ESimCSE, and PromptBERT, and supervised SBERT and CoSENT. ☆96 · Updated last year
- The 2019 Chinese human-machine dialogue natural language understanding competition, hosted by the Social Media Processing Committee of the Chinese Information Processing Society of China ☆73 · Updated 4 years ago
- Chinese pretrained UniLM ☆28 · Updated 4 years ago
- Winning solution of the Tianchi traditional-Chinese-medicine instruction NER challenge; Chinese named entity recognition; NER; BERT-CRF & BERT-SPAN & BERT-MRC; PyTorch ☆23 · Updated 3 years ago
- ☆30 · Updated 5 years ago
- Machine reading comprehension for LIC2019 using BERT ☆89 · Updated 5 years ago
- A long-text classifier implemented in PyTorch ☆45 · Updated 5 years ago
- Convert https://github.com/brightmart/albert_zh to the Google format ☆62 · Updated 3 years ago
- Simple experiments with Pattern-Exploiting Training on Chinese ☆169 · Updated 3 years ago
- ☆84 · Updated 2 years ago
- Experiments on text matching with the Chinese dataset LCQMC ☆27 · Updated 4 years ago
- NLP study notes: tracking the state of the art ☆25 · Updated last year
- Keras implementation of distilling BERT (12 layers) into CNN, BiLSTM, and BERT (3 layers) models ☆27 · Updated 4 years ago
- Introductory articles on dialogue rewriting ☆96 · Updated last year
- NLP experiments: new-word discovery + continued pre-training of pretrained models ☆47 · Updated last year
- Transformer implementations (architecture, task examples, serving, and more) ☆97 · Updated 2 years ago
- LIC2020 relation extraction competition; a PyTorch implementation of Su Jianlin's model. ☆101 · Updated 3 years ago
- Multi-label text classification ☆27 · Updated 2 years ago
- DIAC2019 competition on question-equivalence judgment via adversarial attack ☆81 · Updated 4 years ago