WenRichard / Light_SimMatch
A lightweight text matching framework for the Chinese domain, integrating classic and SOTA models for text matching, textual entailment, paraphrase identification, and related tasks
☆25 · Updated 5 years ago
Related projects
Alternatives and complementary repositories for Light_SimMatch
- ☆23 · Updated 5 years ago
- BDCI2019 Internet Finance New Entity Discovery – 7th place (could have reached top 3) ☆18 · Updated 4 years ago
- Templates for common NLP tasks ☆34 · Updated last year
- Tianchi COVID-19 Similar Sentence Pair Matching Competition – team 大白, Rank 6 ☆22 · Updated 4 years ago
- A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm-large ☆65 · Updated 4 years ago
- 2019 Language and Intelligence Technology Competition, Knowledge-Driven Dialogue track: source code and models for 5th place on leaderboard B ☆27 · Updated 5 years ago
- 2020 Language and Intelligence Technology Competition: recommendation-oriented dialogue task ☆51 · Updated 3 years ago
- BERT-based Chinese named entity recognition (PyTorch) ☆18 · Updated 5 years ago
- TensorFlow version of BERT-of-Theseus ☆63 · Updated 3 years ago
- The code for "A Unified MRC Framework for Named Entity Recognition" ☆34 · Updated 5 years ago
- ☆59 · Updated 5 years ago
- Baseline for the 2021 Sohu Campus Text Matching Algorithm Competition ☆45 · Updated 3 years ago
- A simple BERT pretraining pipeline built on the tokenizers and transformers repos ☆31 · Updated 4 years ago
- Label Mask for Multi-label Classification ☆55 · Updated 3 years ago
- 24×2 pretrained small BERT models, a handy tool for NLP practitioners ☆51 · Updated 4 years ago
- 开天-新词, a Chinese new word discovery tool ☆20 · Updated 4 years ago
- Coupling Distant Annotation and Adversarial Training for Cross-Domain Chinese Word Segmentation ☆22 · Updated 4 years ago
- Knowledge Distillation from BERT ☆51 · Updated 5 years ago
- Tianchi COVID-19 Similar Sentence Pair Matching Competition, Rank 8 ☆52 · Updated 4 years ago
- Seq2seqAttGeneration, a basic text generation implementation that uses a seq2seq attention model to generate poem series. this project… ☆17 · Updated 3 years ago
- Source code for the paper "LET: Linguistic Knowledge Enhanced Graph Transformer for Chinese Short Text Matching", AAAI 2021 ☆48 · Updated 3 years ago
- 3rd place solution for the 2020 BAAI–JD Multimodal Dialogue Challenge (JDDC2020) ☆41 · Updated 4 years ago
- ☆29 · Updated 5 years ago
- A BERT-style pretrained language model implementation in two steps: pretraining and fine-tuning. Currently includes BERT, RoBERTa, and ALBERT, all supporting Whole Word Mask mode ☆16 · Updated 4 years ago
- 2021 Haihua AI Challenge: Chinese Reading Comprehension, Technical Track ☆20 · Updated 2 years ago
- ☆25 · Updated 5 years ago
- ☆49 · Updated 3 years ago
- Some methods for unsupervised text generation ☆49 · Updated 3 years ago
- Baseline for the CCKS2020 entity linking competition (PyTorch) ☆18 · Updated 4 years ago