nilboy / gaic_track3_pair_sim
Champion solution for Track 3 of the Global AI Technology Innovation Contest (GAIC)
☆237 · Updated 3 years ago
Alternatives and similar repositories for gaic_track3_pair_sim:
Users interested in gaic_track3_pair_sim are comparing it to the libraries listed below.
- NEZHA: Neural Contextualized Representation for Chinese Language Understanding ☆262 · Updated 3 years ago
- 3rd-place online solution for the Tianchi Epidemic Text Challenge ☆229 · Updated 4 years ago
- CCF 2020 QA match competition, 1st place ☆266 · Updated 4 years ago
- A baseline for Xiaobu Assistant short-text semantic matching ☆139 · Updated 3 years ago
- Implementations of several deep text matching (text similarity) models for Keras: CDSSM, ARC-II, MatchPyramid, MV-LSTM, ESIM, DRCN, BiMPM, BERT, … ☆291 · Updated 4 years ago
- NLP sentence encoding, sentence embeddings, and semantic similarity: BERT_avg, BERT_whitening, SBERT, SimCSE ☆175 · Updated 3 years ago
- ☆277 · Updated 2 years ago
- CoSENT, STS, SentenceBERT ☆163 · Updated last week
- Chinese NLP datasets ☆153 · Updated 5 years ago
- Knowledge Graph ☆170 · Updated 2 years ago
- Chinese language model pretraining in PyTorch ☆389 · Updated 4 years ago
- Experiments with several semantic matching models and a comparison of their results ☆161 · Updated last year
- Training a masked BERT from scratch ☆136 · Updated 2 years ago
- A roundup of Chinese question-sentence similarity competitions and solutions ☆296 · Updated 4 years ago
- The real "Deep Learning for Humans" ☆141 · Updated 3 years ago
- "Intel Innovation Masters Cup" Deep Learning Challenge, Track 2: CCKS2021 Chinese NLP address element parsing ☆143 · Updated 3 years ago
- Simple experiments with Pattern-Exploiting Training on Chinese ☆170 · Updated 4 years ago
- Reproduction of SimCSE on Chinese, supervised + unsupervised ☆272 · Updated 3 years ago
- GlobalPointer: handling nested and flat NER in a unified way ☆254 · Updated 3 years ago
- Baseline for GAIIC2022 product-title entity recognition, implemented with GlobalPointer, online score 0.80349 ☆53 · Updated 2 years ago
- Fine-tuning pretrained language models for multi-label classification (can load well-known open-source TF-format models such as BERT, RoBERTa, BERT-wwm, and ALBERT) ☆139 · Updated 4 years ago
- Notes on using the T5 model in Keras ☆172 · Updated 2 years ago
- A baseline for WenTianSearch ☆85 · Updated 2 years ago
- 1st-place online solution for the Tianchi Epidemic Similar Sentence Pair Competition ☆432 · Updated 4 years ago
- 2nd-place solution for the CAIL 2019 similar case matching task (with dataset and documentation); champion team of the CAIL2020/2021 judicial exam track ☆246 · Updated 3 years ago
- ☆88 · Updated 3 years ago
- Implementation of SimCSE + ESimCSE on Chinese datasets ☆191 · Updated 2 years ago
- A deep learning NLP framework built on TensorFlow with a Scikit-Learn-style API; supports over 40 model classes covering language models, text classification, NER, MRC, knowledge distillation, and more ☆114 · Updated last year
- PyTorch implementation of unsupervised SimCSE for Chinese ☆133 · Updated 3 years ago
- ☆155 · Updated 3 years ago