wac81 / textda
☆91 · Updated 4 years ago
Alternatives and similar repositories for textda:
Users interested in textda are comparing it to the libraries listed below.
- NLP data augmentation demo ☆47 · Updated 5 years ago
- Joint learning of machine retrieval and reading; rank-6 solution for the 莱斯杯 2nd National "Military Intelligent Machine Reading" Challenge ☆127 · Updated 4 years ago
- Converts https://github.com/brightmart/albert_zh checkpoints to the Google format ☆62 · Updated 4 years ago
- DIAC2019 competition on question-equivalence discrimination based on adversarial attacks ☆81 · Updated 5 years ago
- ☆89 · Updated 4 years ago
- 2019 Chinese human-machine dialogue natural language understanding competition, hosted by the Social Media Processing Committee of the Chinese Information Processing Society of China ☆74 · Updated 5 years ago
- Kesci 莱斯杯: summary of slides, documents, and code from the top-10 teams of the 2nd National "Military Intelligent Machine Reading" Challenge ☆131 · Updated 5 years ago
- Chinese UniLM pre-trained model ☆83 · Updated 4 years ago
- Unsupervised Chinese word segmentation and syntactic parsing based on BERT ☆110 · Updated 4 years ago
- NER task from the Datagrand algorithm competition, from re-training BERT through fine-tuning and prediction ☆75 · Updated 2 years ago
- Keras solution for the Chinese NER task using BiLSTM-CRF/BiGRU-CRF/IDCNN-CRF models with a pre-trained language model, supporting BERT/RoBERTa/ALB… ☆12 · Updated last year
- LIC2020 relation extraction competition; a PyTorch implementation of 苏神 (Su Jianlin)'s model ☆101 · Updated 4 years ago
- PyTorch version of the CLUE baselines ☆74 · Updated 5 years ago
- Transforms multi-label classification into a sentence-pair task, with more training data and information ☆178 · Updated 5 years ago
- ☆87 · Updated 3 years ago
- Transformers implementation (architecture, task examples, serving, and more) ☆95 · Updated 3 years ago
- Fine-tunes pre-trained language models for multi-label classification (can load well-known open-source TF-format models such as BERT, RoBERTa, BERT-wwm, and ALBERT) ☆141 · Updated 4 years ago
- Simple experiments with Pattern-Exploiting Training on Chinese ☆170 · Updated 4 years ago
- Reproduction of the ACL 2020 FastBERT paper: https://arxiv.org/pdf/2004.02178.pdf ☆193 · Updated 3 years ago
- Python toolkit for the Chinese Language Understanding Evaluation (CLUE) benchmark ☆129 · Updated last year
- Machine reading comprehension for LIC2019 using BERT ☆89 · Updated 5 years ago
- Chinese NLP data augmentation, BERT contextual augmentation ☆111 · Updated 3 years ago
- RoBERTa-wwm-base model distilled from RoBERTa-wwm-large ☆65 · Updated 5 years ago
- ☆127 · Updated 2 years ago
- Experiments on text matching with the Chinese dataset LCQMC ☆27 · Updated 5 years ago
- Rank-5 code for the information extraction track of the 2019 Baidu Language and Intelligence Technology Competition ☆69 · Updated 5 years ago
- TensorFlow version of BERT-of-Theseus ☆62 · Updated 4 years ago
- Pre-trained model loading and invocation based on TensorFlow 1.x; supports single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision, with flexible training, validation, and prediction ☆58 · Updated 3 years ago
- Datagrand 2019 information extraction competition, rank 9 ☆130 · Updated 5 years ago
- DistilBERT for Chinese: large-scale Chinese pre-trained distilled BERT model ☆91 · Updated 5 years ago