LongmaoTeamTf / keras_ernie
Pre-trained ERNIE models can be loaded with Keras for feature extraction and prediction.
☆26 Updated 2 years ago
Alternatives and similar repositories for keras_ernie:
Users interested in keras_ernie are comparing it to the libraries listed below.
- Converts https://github.com/brightmart/albert_zh checkpoints to the Google format ☆62 Updated 4 years ago
- Converts Baidu ERNIE PaddlePaddle models to TensorFlow models ☆177 Updated 5 years ago
- A Lite BERT ☆59 Updated 5 years ago
- PyTorch version of the CLUE baselines ☆74 Updated 4 years ago
- Championship code for the CCF BDCI financial negative-information and entity determination task ☆105 Updated 5 years ago
- Chinese named entity recognition based on ELMo and TensorFlow ☆21 Updated 5 years ago
- Data augmentation demo for NLP ☆47 Updated 5 years ago
- Keras solution of Chinese NER task using BiLSTM-CRF/BiGRU-CRF/IDCNN-CRF model with Pretrained Language Model: supporting BERT/RoBERTa/ALB… ☆12 Updated last year
- Tensorflow solution of NER task Using BiLSTM-CRF model with CMU/Google XLNet ☆45 Updated 5 years ago
- DistilBERT for Chinese: large-scale pretrained distilled BERT model for Chinese ☆91 Updated 5 years ago
- ☆91 Updated 4 years ago
- Fine-tunes pretrained language models (BERT, RoBERTa, XLBert, etc.) to compute the similarity between two texts, cast as a sentence-pair classification task; suitable for Chinese text ☆89 Updated 4 years ago
- First place solution of WSDM CUP 2020, pairwise-bert, lightgbm ☆89 Updated 5 years ago
- tensorflow version of bert-of-theseus ☆62 Updated 4 years ago
- Source code and models for 5th place on leaderboard B of the 2019 Language and Intelligence Challenge, knowledge-driven dialogue track ☆25 Updated 5 years ago
- Chinese UniLM pretrained model ☆83 Updated 4 years ago
- 2019 Language and Intelligence Challenge: knowledge-graph-based proactive chat ☆115 Updated 5 years ago
- Chinese pretrained ELECTRA model, based on adversarial learning ☆140 Updated 5 years ago
- Datagrand algorithm competition NER task: from retraining BERT to fine-tuning and prediction ☆75 Updated 2 years ago
- Reproduction of the ACL 2020 FastBERT paper (https://arxiv.org/pdf/2004.02178.pdf) ☆193 Updated 3 years ago
- datagrand 2019 information extraction competition rank9 ☆130 Updated 5 years ago
- Rank 8 in the Tianchi COVID-19 similar sentence-pair matching competition ☆52 Updated 4 years ago
- BERT for the LIC2019 machine reading comprehension task ☆89 Updated 5 years ago
- Similar case matching ☆46 Updated 5 years ago
- Unsupervised Chinese word segmentation and syntactic parsing based on BERT ☆110 Updated 4 years ago
- NLP NER datasets: video/music/book bio ☆88 Updated 4 years ago
- ☆89 Updated 4 years ago
- Final Project for EECS496-7 ☆62 Updated 6 years ago
- Pretrained model invocation based on TensorFlow 1.x; supports single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision, with flexible training, validation, and prediction ☆58 Updated 3 years ago
- ALBERT + BiLSTM + CRF built on lightweight ALBERT ☆88 Updated last year