Lisennlp / two_sentences_classifier
BERT classification, semantic similarity, and sentence-vector extraction.
☆65 · Updated 2 months ago
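A minimal sketch of the idea named in the description above: pooling BERT hidden states into a sentence vector and scoring semantic similarity with cosine distance. The Hugging Face `transformers` API, the `bert-base-chinese` checkpoint, and mean pooling are assumptions for illustration; the repository's own scripts may differ.

```python
# Hedged sketch (not this repository's actual API): sentence vectors from BERT,
# assuming Hugging Face `transformers` and the `bert-base-chinese` checkpoint.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertModel.from_pretrained("bert-base-chinese")
model.eval()

def sentence_vector(text: str) -> torch.Tensor:
    """Mean-pool the last hidden layer over non-padding tokens."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state       # (1, seq_len, 768)
    mask = inputs["attention_mask"].unsqueeze(-1)         # (1, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)   # (1, 768)

vec_a = sentence_vector("今天天气很好")
vec_b = sentence_vector("今天天气不错")
print(torch.nn.functional.cosine_similarity(vec_a, vec_b).item())
```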
Alternatives and similar repositories for two_sentences_classifier:
Users interested in two_sentences_classifier are comparing it to the repositories listed below
- The 2019 Chinese human-machine dialogue natural language understanding competition, organized by the Social Media Processing Committee of the Chinese Information Processing Society of China ☆74 · Updated 4 years ago
- Fine-tune pre-trained language models for multi-label classification (can load well-known open-source TF-format models such as BERT, RoBERTa, BERT-wwm, and ALBERT) ☆140 · Updated 4 years ago
- Implementation of the Slot-Gated SLU model for Keras ☆34 · Updated 3 years ago
- ☆89 · Updated 4 years ago
- ALBERT + BiLSTM + CRF built on the lightweight ALBERT model ☆88 · Updated last year
- WordMultiSenseDisambiguation, Chinese multi-word-sense disambiguation based on an online baike knowledge base and semantic embedding similarity ☆127 · Updated 6 years ago
- Keras solution for a simple knowledge-based QA task with a pretrained language model: supports BERT/RoBERTa/ALBERT ☆20 · Updated last year
- Joint learning of machine retrieval and reading; rank-6 solution for the LES Cup: 2nd National "Military Intelligence Machine Reading" Challenge ☆127 · Updated 4 years ago
- Transformers implementation (architecture, task examples, serving, and more) ☆96 · Updated 2 years ago
- LIC2020 relation extraction competition, with a PyTorch implementation of Su Jianlin's (苏神) model ☆101 · Updated 4 years ago
- Reading-comprehension QA with BERT on the Baidu WebQA Chinese question-answering dataset ☆65 · Updated 4 years ago
- Datagrand algorithm competition NER task, from re-training BERT through fine-tuning and prediction ☆75 · Updated 2 years ago
- An NER project covering multiple Chinese datasets; models include BiLSTM+CRF, BERT+Softmax, BERT+Cascade, BERT+WOL, and others, with TF Serving used for model deployment and both online and offline inference ☆80 · Updated 3 years ago
- 2020 Language and Intelligence Technology Competition: relation extraction task ☆65 · Updated 4 years ago
- ☆91 · Updated 4 years ago
- A deep-learning NLP framework built on TensorFlow with a Scikit-Learn-style design. Supports 40+ model classes covering language modeling, text classification, NER, MRC, knowledge distillation, and more ☆114 · Updated last year
- Fine-tune pre-trained language models (BERT, RoBERTa, XLBert, etc.) to compute the similarity between two texts (cast as a sentence-pair classification task); suited for Chinese text ☆89 · Updated 4 years ago
- Simple experiments with Pattern-Exploiting Training on Chinese ☆170 · Updated 4 years ago
- Code for the ACL 2020 paper "FLAT: Chinese NER Using Flat-Lattice Transformer"; parts of the source code are annotated, modified, and extended to make the results easier to reproduce ☆56 · Updated 4 years ago
- ☆88 · Updated 3 years ago
- SMP2018 Chinese Human-Machine Dialogue Technology Evaluation (ECDT) ☆47 · Updated 6 years ago
- Keras solution for the Chinese NER task using BiLSTM-CRF/BiGRU-CRF/IDCNN-CRF models with a pretrained language model: supports BERT/RoBERTa/ALBERT ☆12 · Updated last year
- Code for the 5th-place entry in the information extraction track of the 2019 Baidu Language and Intelligence Technology Competition ☆69 · Updated 5 years ago
- ALBERT + LSTM + CRF named entity recognition, implemented in PyTorch. The main entities recognized are person names, place names, organization names, and times ☆136 · Updated 2 years ago
- An end-to-end dialogue system that can be deployed online ☆98 · Updated 2 years ago
- Reading comprehension based on the pre-trained BERT model ☆93 · Updated last year
- Convert https://github.com/brightmart/albert_zh to the Google format ☆62 · Updated 4 years ago
- Deep models including BiLSTM, ABCNN, ESIM, RE2, BERT, etc., evaluated on 5 Chinese NLP datasets: LCQMC, BQ Corpus, ChineseSTS, OCN… ☆76 · Updated 2 years ago
- NLP experiments: new-word mining plus continued pre-training of a pre-trained model ☆47 · Updated last year
- Sentence matching models, including unsupervised SimCSE, ESimCSE, and PromptBERT, and supervised SBERT and CoSENT ☆98 · Updated 2 years ago