wangbq18 / distillation_model_keras_bert
Keras implementation of distilling a 12-layer BERT model into CNN, BiLSTM, and 3-layer BERT students
☆29 · Updated 5 years ago
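The description above refers to logit-based knowledge distillation: a small student network is trained to match the softened output distribution of a large teacher alongside the hard labels. The sketch below is a minimal, hypothetical illustration of that idea in Keras (TensorFlow 2.x); it is not the repository's code, and the target packing, `temperature`, and `alpha` weighting are assumptions.

```python
# Hypothetical sketch only: `make_distill_loss`, `num_classes`, `temperature`,
# and `alpha` are illustrative assumptions, not the repository's actual API.
import tensorflow as tf
from tensorflow import keras

def make_distill_loss(num_classes, temperature=2.0, alpha=0.5):
    """Blend a soft-target KL term (teacher) with hard-label cross-entropy."""
    kl = keras.losses.KLDivergence()
    ce = keras.losses.CategoricalCrossentropy(from_logits=True)

    def loss(y_true, y_pred):
        # y_true packs [one-hot hard labels | teacher logits] side by side;
        # y_pred holds the student's raw logits.
        hard_labels = y_true[:, :num_classes]
        teacher_logits = y_true[:, num_classes:]
        # Soften both distributions with the same temperature and rescale
        # the KL term by T^2, following the standard distillation recipe.
        soft_teacher = tf.nn.softmax(teacher_logits / temperature)
        soft_student = tf.nn.softmax(y_pred / temperature)
        soft_loss = kl(soft_teacher, soft_student) * temperature ** 2
        hard_loss = ce(hard_labels, y_pred)
        return alpha * soft_loss + (1.0 - alpha) * hard_loss

    return loss

# Usage sketch: the student (e.g. a CNN, BiLSTM, or 3-layer BERT) outputs raw
# logits; training targets are np.concatenate([one_hot, teacher_logits], -1).
# student.compile(optimizer="adam", loss=make_distill_loss(num_classes=2))
```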
Alternatives and similar repositories for distillation_model_keras_bert
Users that are interested in distillation_model_keras_bert are comparing it to the libraries listed below
- Demo of data augmentation for NLP ☆48 · Updated 5 years ago
- Text matching competition based on adversarial attack ☆42 · Updated 6 years ago
- Datagrand algorithm competition NER task: from re-training BERT through fine-tuning and prediction ☆75 · Updated 3 years ago
- The 2019 Chinese human-machine dialogue natural language understanding competition, organized by the Social Media Processing Committee of the Chinese Information Processing Society of China ☆77 · Updated 5 years ago
- datagrand 2019 information extraction competition rank9 ☆130 · Updated 5 years ago
- Competition model for the CAIL2019 (China AI & Law Challenge) reading comprehension task ☆42 · Updated 6 years ago
- Convert https://github.com/brightmart/albert_zh to the Google format ☆62 · Updated 5 years ago
- Joint learning of machine retrieval and reading; rank 6 solution for the LES Cup: 2nd National "Military Intelligence Machine Reading" Challenge ☆128 · Updated 5 years ago
- 6th place code for the 2019 Datagrand Cup ☆44 · Updated 2 years ago
- ☆92 · Updated 5 years ago
- 5th place code for the information extraction track of the 2019 Baidu Language and Intelligence Technology Competition ☆69 · Updated 6 years ago
- A long-text classifier implemented in PyTorch ☆46 · Updated 6 years ago
- Similar case matching ☆48 · Updated 6 years ago
- Tianchi COVID-19 similar sentence pair judgment competition, Rank 8 ☆52 · Updated 5 years ago
- Champion solution of the Tianchi traditional Chinese medicine instruction entity recognition challenge; Chinese named entity recognition; NER; BERT-CRF & BERT-SPAN & BERT-MRC; PyTorch ☆24 · Updated 4 years ago
- Baseline for the CCKS2020 few-shot cross-class transfer event extraction task in the financial domain ☆56 · Updated 3 years ago
- CAIL 2019 reading comprehension track, top 3 ☆151 · Updated 2 years ago
- 开天-新词, a Chinese new word discovery tool ☆22 · Updated 6 years ago
- Use ELMo in a Chinese environment ☆105 · Updated 7 years ago
- Keras solution of the Chinese NER task using BiLSTM-CRF/BiGRU-CRF/IDCNN-CRF models with a pretrained language model: supporting BERT/RoBERTa/ALB… ☆13 · Updated 2 years ago
- 8th place solution for the 2019 CAIL machine reading comprehension challenge ☆16 · Updated 6 years ago
- BERT-based unsupervised word segmentation and syntactic parsing ☆109 · Updated 5 years ago
- Transform multi-label classification into a sentence-pair task, with more training data and information ☆178 · Updated 6 years ago
- ☆31 · Updated 6 years ago
- 5th place solution for the Duplicate Question Detection competition, based on adversarial attack ☆39 · Updated 5 years ago
- DIAC2019 question equivalence judgment competition based on adversarial attack ☆82 · Updated 5 years ago
- Keras implementation of DGCNN for reading comprehension ☆164 · Updated 6 years ago
- A personal text classifier built on Google's open-source BERT (fine-tuning approach), which can freely load well-known pretrained NLP language models: BERT, BERT-wwm, RoBERTa, ALBERT, and ERNIE 1.0 ☆42 · Updated 5 years ago
- ☆19 · Updated 2 years ago
- Rank 7 solution for the LES Cup: 2nd National "Military Intelligence Machine Reading" Challenge ☆40 · Updated 6 years ago