beyondguo / Learn_PyTorch
learn jiu wan shier le ("just learn it and you're done")
☆53 · Updated 3 years ago
Alternatives and similar repositories for Learn_PyTorch
Users interested in Learn_PyTorch are comparing it to the libraries listed below
- Reproduction of the supervised and unsupervised SimCSE experiments ☆149 · Updated last year
- WoBERT_pytorch ☆41 · Updated 4 years ago
- Sentence-matching models, including unsupervised SimCSE, ESimCSE, and PromptBERT, plus supervised SBERT and CoSENT ☆100 · Updated 2 years ago
- This repository implements a prompt tuning model for hierarchical text classification, accepted as the long paper "HPT… ☆67 · Updated last year
- PyTorch implementation of unsupervised SimCSE for Chinese ☆135 · Updated 4 years ago
- Prompt-based Chinese text classification ☆55 · Updated 2 years ago
- Implementation of SimCSE + ESimCSE on Chinese datasets ☆193 · Updated 3 years ago
- kpt code ☆208 · Updated 2 years ago
- ☆33 · Updated 4 years ago
- Chinese machine reading comprehension datasets ☆104 · Updated 4 years ago
- Pretrained Chinese BigBird model ☆95 · Updated 3 years ago
- Tracks the latest job-market situation for AI roles in NLP, CV, search, recommendation, etc. ☆29 · Updated 2 years ago
- Simple experiments with Pattern-Exploiting Training on Chinese ☆174 · Updated 4 years ago
- Chinese coreference resolution based on SpanBERT, implemented in PyTorch ☆101 · Updated 2 years ago
- ☆58 · Updated 2 years ago
- Code for our paper "Dual Contrastive Learning: Text Classification via Label-Aware Data Augmentation" ☆162 · Updated 2 years ago
- Reproduction of the paper "Spelling Error Correction with Soft-Masked BERT" ☆36 · Updated 3 years ago
- Survey of NLP+AI Conferences and Journals for NLPers ☆41 · Updated 3 months ago
- ☆29 · Updated last year
- Data augmentation for NLP, accepted at EMNLP 2021 Findings ☆105 · Updated last year
- Truly "Deep Learning for Humans" ☆142 · Updated 3 years ago
- Continued pretraining of Chinese BERT ☆31 · Updated 4 years ago
- Text classification baselines: BERT, semi-supervised learning with UDA, adversarial training, data augmentation ☆104 · Updated 4 years ago
- ☆71 · Updated 3 years ago
- A concise implementation of SimCSE ☆17 · Updated 4 years ago
- Experiments with several semantic matching models and a comparison of their results ☆163 · Updated 2 years ago
- The code for our paper "NSP-BERT: A Prompt-based Zero-Shot Learner Through an Original Pre-training Task —— Next Sentence Prediction" ☆229 · Updated 2 years ago
- CoSENT, STS, SentenceBERT ☆170 · Updated 6 months ago
- PyTorch baseline for the event extraction part of the multi-format information extraction track of Baidu's 2021 Language and Intelligence Challenge ☆79 · Updated 4 years ago
- ☆48 · Updated 3 years ago