liu-nlper / AhoCorasickAutomation
A Python implementation of the Aho-Corasick automaton.
☆18 · Updated 4 years ago
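For orientation, below is a minimal, self-contained sketch of the Aho-Corasick technique the repository implements: build a trie of the patterns, wire up failure links with a BFS, then scan the text in a single pass. The `AhoCorasick` class and its method names are illustrative and are not taken from the repository's API.

```python
from collections import deque

class AhoCorasick:
    """Multi-pattern matcher: one pass over the text after building
    a trie of the patterns with BFS failure links."""

    def __init__(self, patterns):
        self.goto = [{}]   # goto[state] maps a char to the next state
        self.fail = [0]    # fail[state] is the longest-proper-suffix state
        self.out = [[]]    # out[state] lists patterns ending at this state
        for pat in patterns:
            self._insert(pat)
        self._build()

    def _insert(self, pat):
        state = 0
        for ch in pat:
            if ch not in self.goto[state]:
                self.goto.append({})
                self.fail.append(0)
                self.out.append([])
                self.goto[state][ch] = len(self.goto) - 1
            state = self.goto[state][ch]
        self.out[state].append(pat)

    def _build(self):
        # BFS: parents get their fail links before their children.
        queue = deque(self.goto[0].values())  # depth-1 states fail to root
        while queue:
            state = queue.popleft()
            for ch, nxt in self.goto[state].items():
                queue.append(nxt)
                f = self.fail[state]
                while f and ch not in self.goto[f]:
                    f = self.fail[f]
                self.fail[nxt] = self.goto[f].get(ch, 0)
                # Inherit matches reachable through the failure link.
                self.out[nxt] += self.out[self.fail[nxt]]

    def find_all(self, text):
        """Yield (start_index, pattern) for every occurrence."""
        state = 0
        for i, ch in enumerate(text):
            while state and ch not in self.goto[state]:
                state = self.fail[state]
            state = self.goto[state].get(ch, 0)
            for pat in self.out[state]:
                yield (i - len(pat) + 1, pat)

ac = AhoCorasick(["he", "she", "his", "hers"])
print(list(ac.find_all("ushers")))  # [(1, 'she'), (2, 'he'), (2, 'hers')]
```

The failure links are what make the scan linear in the text length: on a mismatch the automaton falls back to the longest suffix of the current path that is still a prefix of some pattern, instead of restarting from the root.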
Alternatives and similar repositories for AhoCorasickAutomation
Users interested in AhoCorasickAutomation are comparing it to the libraries listed below.
- Data-augmentation demo for NLP ☆48 · Updated 5 years ago
- Fine-tuning pretrained language models for multi-label classification (can load well-known open-source TF-format models such as BERT, RoBERTa, BERT-wwm, and ALBERT) ☆141 · Updated 5 years ago
- The 2019 Chinese human-machine dialogue natural-language-understanding competition, organized by the Social Media Processing committee of the Chinese Information Processing Society of China ☆76 · Updated 5 years ago
- NLP-related tasks, including text classification, sequence labeling, text relations, machine translation, and more ☆68 · Updated 5 years ago
- Tool for extracting character embeddings from pretrained BERT models ☆53 · Updated 5 years ago
- Joint learning of machine retrieval and reading; rank-6 solution for the LES Cup (莱斯杯), the 2nd national "Military Intelligent Machine Reading" challenge ☆129 · Updated 4 years ago
- Fine-tuning pretrained language models (BERT, RoBERTa, XLBert, etc.) to compute the similarity between two texts, cast as a sentence-pair classification task; suited to Chinese text ☆90 · Updated 5 years ago
- A model that addresses the lack of diversity in generation tasks (translation, paraphrasing, etc.) ☆46 · Updated 6 years ago
- Keyword-extraction project ☆24 · Updated 4 years ago
- A simple trie implemented in Python, supporting add/find/delete of keywords, for keyword matching and stopword removal in Chinese text (see the trie sketch after this list) ☆65 · Updated 5 years ago
- ☆92 · Updated 5 years ago
- Transformer implementations (architecture, task examples, serving, and more) ☆96 · Updated 3 years ago
- One command to produce BERT/ALBERT sentence embeddings, for similarity computation, text classification, etc. ☆35 · Updated 2 years ago
- Multi-label text classification with BERT/ALBERT ☆28 · Updated 3 years ago
- 9th-place solution for the internet-finance new-entity-discovery track of the CCF-BDCI Big Data & Computing Intelligence Contest ☆55 · Updated 5 years ago
- BERT-based semantic matching built on pretrained Chinese models, using the official LCQMC dataset ☆198 · Updated 5 years ago
- Chinese text error-correction model, implemented in Keras ☆75 · Updated 4 years ago
- ☆280 · Updated 4 years ago
- Keras solution for a simple knowledge-based QA task with pretrained language models, supporting BERT/RoBERTa/ALBERT ☆22 · Updated 2 years ago
- LIC2020 relation-extraction competition: a PyTorch implementation of Su Jianlin (苏神)'s model ☆102 · Updated 4 years ago
- Introductory articles on dialogue rewriting ☆98 · Updated 2 years ago
- SMP2018 Chinese Human-Computer Dialogue Technology Evaluation (ECDT) ☆47 · Updated 6 years ago
- Experiments on text matching with the Chinese dataset LCQMC ☆27 · Updated 5 years ago
- Using BERT for LIC2019 machine reading comprehension ☆90 · Updated 6 years ago
- Unsupervised Chinese word segmentation and syntactic parsing based on BERT ☆110 · Updated 5 years ago
- TensorFlow implementation of recent Chinese named-entity-recognition models ☆14 · Updated 4 years ago
- NER task from the DataGrand algorithm competition, from retraining BERT through fine-tuning and prediction ☆75 · Updated 2 years ago
- bert_chinese ☆39 · Updated 2 years ago
- Chinese pretrained UniLM ☆28 · Updated 5 years ago
- A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm-large ☆65 · Updated 5 years ago
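As a companion to the trie-based keyword repository listed above, here is a minimal dict-of-dicts trie supporting add/find/delete; the `Trie` class and its method names are hypothetical and assume nothing about that project's actual interface.

```python
class Trie:
    """Minimal trie for keyword matching: nested dicts, with a
    sentinel key marking where a stored keyword ends."""

    END = "$"  # sentinel; assumes keywords never contain "$"

    def __init__(self):
        self.root = {}

    def add(self, word):
        node = self.root
        for ch in word:
            node = node.setdefault(ch, {})
        node[self.END] = True

    def find(self, word):
        node = self.root
        for ch in word:
            if ch not in node:
                return False
            node = node[ch]
        return self.END in node

    def delete(self, word):
        # Walk down recording the path, unmark the end flag, then
        # prune nodes that no longer lead to any keyword.
        path, node = [], self.root
        for ch in word:
            if ch not in node:
                return False
            path.append((node, ch))
            node = node[ch]
        if self.END not in node:
            return False
        del node[self.END]
        for parent, ch in reversed(path):
            if parent[ch]:  # still has children or an end flag
                break
            del parent[ch]
        return True

t = Trie()
t.add("停用词"); t.add("停止")
assert t.find("停用词") and not t.find("停用")
t.delete("停用词")
assert not t.find("停用词") and t.find("停止")
```

Storing children in plain dicts keeps lookups O(1) per character and handles Chinese text naturally, since each node keys on a full Unicode character rather than a byte.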