shawroad / CoSENT_Pytorch
CoSENT, STS, SentenceBERT
☆170 · Updated 7 months ago
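The repository's description points to the CoSENT training objective for sentence embeddings. As orientation, here is a minimal PyTorch sketch of that ranking loss, written from the published CoSENT formulation rather than taken from this repository's code; the function name `cosent_loss` and the scale value of 20 are illustrative assumptions.

```python
import torch


def cosent_loss(cos_sim: torch.Tensor, labels: torch.Tensor, scale: float = 20.0) -> torch.Tensor:
    """CoSENT ranking loss for a batch of sentence pairs (sketch, not the repo's code).

    cos_sim: shape (B,), cosine similarity of each sentence pair.
    labels:  shape (B,), similarity labels (higher means more similar).
    """
    cos_sim = cos_sim * scale
    # diff[i, j] = s_j - s_i; the loss pushes this down whenever
    # pair i is labeled more similar than pair j.
    diff = cos_sim.unsqueeze(0) - cos_sim.unsqueeze(1)    # (B, B)
    mask = labels.unsqueeze(1) > labels.unsqueeze(0)      # keep (i, j) with label_i > label_j
    diff = diff[mask]
    # loss = log(1 + sum(exp(diff))), computed as logsumexp with a prepended 0.
    diff = torch.cat([torch.zeros(1, device=diff.device, dtype=diff.dtype), diff])
    return torch.logsumexp(diff, dim=0)
```

For example, with `cos_sim = torch.tensor([0.9, 0.2])` and `labels = torch.tensor([1, 0])`, the loss is small because the more similar pair already has the higher cosine score.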
Alternatives and similar repositories for CoSENT_Pytorch
Users interested in CoSENT_Pytorch are comparing it to the libraries listed below.
- Experiments with several semantic matching models and a comparison of their results. ☆163 · Updated 2 years ago
- NLP sentence encoding, sentence embeddings, and semantic similarity: BERT_avg, BERT_whitening, SBERT, SimCSE. ☆178 · Updated 3 years ago
- Implementation of SimCSE + ESimCSE on Chinese datasets. ☆193 · Updated 3 years ago
- ☆279 · Updated 3 years ago
- Reproduction of SimCSE for Chinese, supervised + unsupervised. ☆278 · Updated 6 months ago
- Sentence matching models, including unsupervised SimCSE, ESimCSE, and PromptBERT, and supervised SBERT and CoSENT. ☆100 · Updated 2 years ago
- PyTorch implementation of unsupervised SimCSE for Chinese. ☆135 · Updated 4 years ago
- Knowledge Graph. ☆175 · Updated 3 years ago
- ☆87 · Updated 3 years ago
- Simple experiments with Pattern-Exploiting Training on Chinese. ☆174 · Updated 4 years ago
- GlobalPointer: unified handling of nested and flat NER. ☆255 · Updated 4 years ago
- NEZHA: Neural Contextualized Representation for Chinese Language Understanding. ☆262 · Updated 4 years ago
- Truly "Deep Learning for Humans". ☆142 · Updated 3 years ago
- Reproduction of supervised and unsupervised SimCSE experiments. ☆149 · Updated last year
- A simple framework for building basic NLP tasks. ☆60 · Updated 2 years ago
- Chinese coreference resolution based on SpanBERT, implemented in PyTorch. ☆101 · Updated 2 years ago
- Chinese version of Longformer. ☆115 · Updated 4 years ago
- Source code for the ACL 2021 paper "PLOME: Pre-training with Misspelled Knowledge for Chinese Spelling Correction". ☆239 · Updated 3 years ago
- Chinese NLP datasets. ☆158 · Updated 6 years ago
- ☆33 · Updated 4 years ago
- A PyTorch-based text classification framework supporting TextCNN, BERT, ELECTRA, and more. ☆63 · Updated 2 years ago
- Champion solution for Track 3 of the Global Artificial Intelligence Technology Innovation Competition. ☆239 · Updated 4 years ago
- GAIIC2022 product-title entity recognition baseline, implemented with GlobalPointer; online score 0.80349. ☆54 · Updated 3 years ago
- Reproduction of the paper "Named Entity Recognition as Dependency Parsing". ☆131 · Updated 4 years ago
- ☆136 · Updated 3 years ago
- TPLinker for NER: Chinese/English named entity recognition. ☆127 · Updated 4 years ago
- 3,000,000+ semantic understanding and matching examples, usable for unsupervised contrastive learning, semi-supervised learning, etc., to build the best-performing Chinese pretrained models. ☆304 · Updated 2 years ago
- Hugging BERT together. Misc scripts for Huggingface transformers. ☆72 · Updated 2 years ago
- Entity/relation/event extraction based on GlobalPointer. ☆149 · Updated 3 years ago
- Prompt-based Chinese text classification. ☆55 · Updated 2 years ago