wanglke / SiameseNet
☆19 · Updated 4 years ago
Alternatives and similar repositories for SiameseNet
Users that are interested in SiameseNet are comparing it to the libraries listed below
- Named entity recognition using a multi-head approach ☆33 · Updated 4 years ago
- NLP experiments: new-word mining plus continued pre-training of a pretrained model ☆47 · Updated last year
- Multi-label text classification with BERT/ALBERT, seq2seq, attention, and beam search ☆32 · Updated 3 years ago
- Global AI Technology Innovation Competition, Track 3: Xiaobu Assistant short-text semantic matching for dialogue ☆37 · Updated 4 years ago
- Study of the paper "A Frustratingly Easy Approach for Joint Entity and Relation Extraction" ☆31 · Updated 4 years ago
- Multi-label text classification ☆30 · Updated 3 years ago
- Text classification baselines: BERT, semi-supervised learning (UDA), adversarial training, and data augmentation ☆102 · Updated 4 years ago
- Top-2 solution for the 2021 Sohu Campus Text Matching Algorithm Competition ☆36 · Updated last year
- Label Mask for Multi-label Classification ☆56 · Updated 3 years ago
- Solution for the 2021 Sohu Campus Text Matching Algorithm Competition ☆17 · Updated 6 months ago
- CCKS 2020: few-shot cross-domain event extraction for the financial domain; implements an MRC-based event extraction method ☆39 · Updated 2 years ago
- Baseline for the CCKS 2020 few-shot cross-domain financial event extraction task ☆55 · Updated 2 years ago
- PyTorch version of WoBERT, a Chinese word-level BERT; a study of WoBERT ☆21 · Updated 4 years ago
- Sentence matching models, including unsupervised SimCSE, ESimCSE, and PromptBERT, and supervised SBERT and CoSENT ☆99 · Updated 2 years ago
- Multi-Label Text Classification Based on BERT ☆22 · Updated 2 years ago
- CCKS 2021 event extraction competition ☆30 · Updated 3 years ago
- BERT_MRC, a state-of-the-art model for NER tasks ☆61 · Updated last year
- 2020 Language and Intelligence Challenge: solution code for the event extraction task ☆28 · Updated 2 years ago
- Baidu 2020 Language and Intelligence Challenge: solution code for the event extraction track ☆53 · Updated 4 years ago
- Baseline implementations for several NLP tasks, including text classification, named entity recognition, entity relation extraction, NL2SQL, CKBQA, and various BERT downstream applications ☆47 · Updated 4 years ago
- Reproduction of the paper "Simplify the Usage of Lexicon in Chinese NER" ☆42 · Updated 4 years ago
- Tianchi COVID-19 similar sentence pair competition, rank 8 ☆52 · Updated 5 years ago
- ☆38 · Updated 5 years ago
- Chinese sentence-pair similarity matching on the ATEC dataset ☆22 · Updated 4 years ago
- 4th-place solution for the 5th DataFountain Datagrand Cup ☆50 · Updated 2 years ago
- Summary of the 5th-place method in the CCKS 2020 evaluation on ontology-based automated financial knowledge graph construction ☆50 · Updated 2 years ago
- PyTorch baseline for the CCKS 2020 entity linking competition ☆19 · Updated 4 years ago
- CCKS 2020 entity linking for Chinese short texts. Approach: a BiLSTM- and attention-based semantic model matches the query against candidate documents, then pairwise ranking over the match scores selects the best knowledge-base entity ☆47 · Updated 4 years ago
- Simplified bert-flat with extensive comments ☆15 · Updated 3 years ago
- Chinese named entity recognition with TPLinker_plus in PyTorch ☆18 · Updated 2 years ago