0oTedo0 / NLP-Beginner
Exercises in Natural Language Processing.
☆ 33 · Updated 3 years ago
Alternatives and similar repositories for NLP-Beginner
Users interested in NLP-Beginner are comparing it to the repositories listed below.
- Implementation code for the nlp-beginner project recommended by Prof. Qiu Xipeng of Fudan University ☆ 129 · Updated 4 years ago
- SimCSE contrastive-learning model for Chinese semantic similarity ☆ 85 · Updated 3 years ago
- Implementations of common NLP tasks (PyTorch version) ☆ 122 · Updated 4 years ago
- Reproduction of supervised and unsupervised SimCSE experiments ☆ 149 · Updated last year
- ☆ 19 · Updated 4 years ago
- Multi-label text classification based on PyTorch + BERT ☆ 103 · Updated last year
- A collection of classic NER models; the repository currently includes code for BERT, LSTM, GlobalPointer, CRF, and HMM ☆ 34 · Updated 2 years ago
- A DDP version of bert_seq2seq supporting models such as bert, roberta, nezha, t5, and gpt2, and tasks such as seq2seq, NER, and relation extraction; launches multi-GPU DDP training without extra code ☆ 52 · Updated 2 years ago
- ☆ 39 · Updated 2 years ago
- ☆ 278 · Updated 3 years ago
- Implementation of SimCSE + ESimCSE on Chinese datasets ☆ 192 · Updated 3 years ago
- Code for nlp-beginner, including sentiment analysis, NER, NLI, and language modeling ☆ 55 · Updated 5 years ago
- Prompt-based Chinese text classification ☆ 55 · Updated 2 years ago
- Natural language processing (covering natural language generation, NLG, and natural language understanding, NLU): an overview of NLP academic conferences, notable researchers, research institutions, resources, learning materials, and paper introductions ☆ 185 · Updated 2 months ago
- Reproduction of SimCSE for Chinese, supervised + unsupervised ☆ 277 · Updated 3 months ago
- A PyTorch implementation of BiLSTM / BERT / RoBERTa (+ BiLSTM + CRF) models for Chinese word segmentation (中文分词) ☆ 210 · Updated 2 years ago
- Chinese multi-label classification based on pytorch_bert ☆ 91 · Updated 3 years ago
- Implementation of an NER model on a Chinese dataset ☆ 73 · Updated 2 years ago
- Reading notes on top-conference papers relevant to NLP algorithm engineers (information extraction) ☆ 29 · Updated 2 years ago
- A repository of a few projects built in torch ☆ 43 · Updated 3 years ago
- PyTorch implementations of various neural network models ☆ 40 · Updated 2 years ago
- Multi-model Chinese text classification on the cnews news dataset ☆ 55 · Updated 5 years ago
- Papers related to information extraction ☆ 73 · Updated 2 years ago
- Crawling Chinese Wikipedia with a Scrapy-based hierarchical priority-queue approach, with automatic extraction of structured and semi-structured data ☆ 152 · Updated 2 years ago
- A PyTorch reproduction of Su Jianlin's (苏神) SPACE model ☆ 42 · Updated 3 years ago
- Using BERT + Bi-LSTM + CRF ☆ 139 · Updated 3 years ago
- Chinese word segmentation and POS tagging based on an HMM and the Viterbi algorithm, optimized with a maximum-probability algorithm. On the People's Daily corpus: segmentation F1 96.189%, POS tagging F1 97.934% ☆ 26 · Updated 2 years ago
- Fine-tuning ChatGLM with LoRA ☆ 49 · Updated last year
- 🗺️ A learning roadmap for natural language processing ☆ 109 · Updated 2 years ago
- 2021 CCF BDCI: automatic news summarization ☆ 18 · Updated 3 years ago
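Several repositories above (the HMM + Viterbi segmenter, the CRF-based NER models) decode a character sequence into a BMES tag sequence (B = word begin, M = middle, E = end, S = single-character word) via Viterbi decoding. A minimal sketch of that decoding step, with toy hand-set probabilities that are purely illustrative and not taken from any listed repository:

```python
import math

# BMES tag set for Chinese word segmentation.
STATES = ["B", "M", "E", "S"]

# Toy parameters (illustrative placeholders, not trained values).
start_p = {"B": 0.6, "M": 0.0, "E": 0.0, "S": 0.4}
trans_p = {
    "B": {"B": 0.0, "M": 0.3, "E": 0.7, "S": 0.0},
    "M": {"B": 0.0, "M": 0.4, "E": 0.6, "S": 0.0},
    "E": {"B": 0.5, "M": 0.0, "E": 0.0, "S": 0.5},
    "S": {"B": 0.5, "M": 0.0, "E": 0.0, "S": 0.5},
}
# Emission probabilities for a tiny vocabulary; in practice these are
# estimated from a tagged corpus such as People's Daily.
emit_p = {
    "B": {"中": 0.5, "国": 0.2},
    "M": {"国": 0.5},
    "E": {"国": 0.5, "人": 0.3},
    "S": {"人": 0.4},
}

def viterbi(obs, start_p, trans_p, emit_p):
    """Return the most probable BMES tag sequence for the characters in obs."""
    def logp(x):
        return math.log(x) if x > 0 else float("-inf")
    # V[t][s] = best log-probability of any path ending in state s at step t.
    V = [{s: logp(start_p[s]) + logp(emit_p[s].get(obs[0], 1e-8)) for s in STATES}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in STATES:
            best_prev, best_score = max(
                ((p, V[t - 1][p] + logp(trans_p[p][s])) for p in STATES),
                key=lambda x: x[1],
            )
            V[t][s] = best_score + logp(emit_p[s].get(obs[t], 1e-8))
            back[t][s] = best_prev
    # Backtrack from the best final state.
    tag = max(STATES, key=lambda s: V[-1][s])
    path = [tag]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

print(viterbi("中国人", start_p, trans_p, emit_p))  # ['B', 'E', 'S'] → 中国 / 人
```

With these toy parameters the decoder tags 中国人 as B-E-S, i.e. it segments the string into 中国 and 人; the BERT/CRF repositories replace the hand-set emission scores with learned per-character scores but keep the same dynamic-programming decode.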