xv44586 / toolkit4nlp
Transformer implementations (architectures, task examples, serving, and more)
☆95 · Updated 3 years ago
Alternatives and similar repositories for toolkit4nlp:
Users interested in toolkit4nlp are comparing it to the libraries listed below
- Chinese version of the UniLM pre-trained model ☆83 · Updated 4 years ago
- ☆89 · Updated 4 years ago
- Using BERT for the LIC2019 machine reading comprehension task ☆89 · Updated 5 years ago
- Joint learning of machine retrieval and reading comprehension; rank-6 solution for the LES Cup, the 2nd national "Military Intelligent Machine Reading" challenge ☆127 · Updated 4 years ago
- ☆90 · Updated 4 years ago
- The 2019 Chinese human-machine dialogue natural language understanding competition, hosted by the Social Media Processing Technical Committee of the Chinese Information Processing Society of China ☆74 · Updated 5 years ago
- LIC2020 relation extraction competition; a PyTorch implementation of Su Jianlin's model. ☆101 · Updated 4 years ago
- Knowledge graph question answering based on "Seq2Seq + prefix tree (trie)" ☆71 · Updated 3 years ago
- A deep learning NLP framework built on TensorFlow with a Scikit-Learn-style design. Supports 40+ model classes covering language modeling, text classification, NER, MRC, knowledge distillation, and more ☆114 · Updated last year
- CAIL 2019 (China's legal AI challenge) reading comprehension track, top-3 solution ☆150 · Updated last year
- Unsupervised Chinese word segmentation and syntactic parsing based on BERT ☆110 · Updated 4 years ago
- BERT classification, semantic similarity, and sentence-vector extraction. ☆64 · Updated last month
- CCKS 2019 evaluation task 5: information extraction from public company announcements, 3rd place ☆121 · Updated 5 years ago
- Fine-tunes pre-trained language models (BERT, RoBERTa, XLBert, etc.) to compute the similarity between two texts (cast as a sentence-pair classification task); designed for Chinese text ☆89 · Updated 4 years ago
- An introductory article on dialogue rewriting ☆97 · Updated last year
- Datagrand algorithm competition NER task: from re-pretraining BERT to fine-tuning and prediction. ☆75 · Updated 2 years ago
- Converts https://github.com/brightmart/albert_zh to the Google format ☆62 · Updated 4 years ago
- Baidu AI Studio event extraction competition ☆224 · Updated 2 years ago
- Simple experiments with Pattern-Exploiting Training on Chinese ☆171 · Updated 4 years ago
- Pre-trained model usage based on TensorFlow 1.x; supports single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision. Flexible training, validation, and prediction. ☆58 · Updated 3 years ago
- IPRE: a Dataset for Inter-Personal Relationship Extraction ☆93 · Updated 5 years ago
- Tianchi COVID-19 similar sentence-pair matching competition, rank 8 ☆52 · Updated 4 years ago
- Keras solution of Chinese NER task using BiLSTM-CRF/BiGRU-CRF/IDCNN-CRF model with Pretrained Language Model: supporting BERT/RoBERTa/ALB… ☆12 · Updated last year
- Adversarial attack text-matching competition ☆42 · Updated 5 years ago
- Fine-tunes pre-trained language models for multi-label classification (can load well-known open-source TF-format models such as BERT, RoBERTa, BERT-wwm, and ALBERT) ☆141 · Updated 4 years ago
- Similar case matching ☆46 · Updated 5 years ago
- DistilBERT for Chinese: a distilled BERT model pre-trained on large-scale Chinese corpora ☆91 · Updated 5 years ago
- ALBERT + BiLSTM + CRF implementation based on lightweight ALBERT ☆89 · Updated last year
- ☆87 · Updated 3 years ago
- Shared third-place solution for the element extraction task of CAIL 2019, China's legal AI challenge ☆138 · Updated 4 years ago