DDigimon / TCAMP-WEEK2
TAL (好未来) bootcamp, week 2: automatic grading
☆30 · Updated 7 years ago
Alternatives and similar repositories for TCAMP-WEEK2:
Users interested in TCAMP-WEEK2 are comparing it to the libraries listed below.
- SMP2018 Chinese Human-Computer Dialogue Technology Evaluation (ECDT) ☆47 · Updated 6 years ago
- Implementation of the paper "Chinese Grammatical Error Diagnosis with Long Short-Term Memory Networks" ☆49 · Updated 6 years ago
- Self-implemented query spell correction based on pinyin similarity and edit distance ☆82 · Updated 2 years ago
- A Chinese word segmentation model based on BERT, F1 score 97% ☆92 · Updated 5 years ago
- Organization-name entity recognition implemented with an HMM model ☆46 · Updated 6 years ago
- Character-based word vector training ☆88 · Updated 6 years ago
- 2019 Language and Intelligence Challenge, knowledge-driven dialogue track: source code and models for 5th place on leaderboard B ☆25 · Updated 5 years ago
- Final project for EECS496-7 ☆62 · Updated 6 years ago
- bert_chinese ☆39 · Updated 2 years ago
- CGED & CSC ☆22 · Updated 5 years ago
- A curated list of Chinese corpora resources for NLP (Natural Language Processing) ☆74 · Updated 5 years ago
- TensorFlow solution for the NER task using a BiLSTM-CRF model with CMU/Google XLNet ☆45 · Updated 5 years ago
- Blog: https://www.cnblogs.com/llhthinker/p/8978029.html ☆41 · Updated 6 years ago
- ☆89 · Updated 4 years ago
- Uses a pretrained ALBERT model to recognize time expressions in text, and checks whether prediction latency improves significantly ☆56 · Updated 5 years ago
- Simple demo implementations of QA, chat, and task-oriented dialogue ☆26 · Updated 5 years ago
- CCKS 2018 open-domain Chinese question answering task, 1st-place solution ☆109 · Updated 5 years ago
- ☆74 · Updated 5 years ago
- Loading the CPM_LM model with bert4keras ☆51 · Updated 4 years ago
- Character-vector learning with Chinese pretrained models; tests the Chinese performance of BERT and ELMo ☆98 · Updated 5 years ago
- Experiments on text matching with the Chinese dataset LCQMC ☆27 · Updated 5 years ago
- ☆18 · Updated 2 years ago
- PyTorch version of the CLUE baselines ☆74 · Updated 4 years ago
- KBQA based on Elasticsearch ☆55 · Updated 6 years ago
- Chinese text error correction model, implemented in Keras ☆73 · Updated 3 years ago
- Related work on open-domain question answering over knowledge bases ☆69 · Updated 6 years ago
- Chinese Grammatical Error Diagnosis ☆22 · Updated 6 years ago
- Word similarity computation based on Tongyici Cilin ☆119 · Updated 7 years ago
- Pytorch-BERT-CRF-NER; Chinese named entity recognition ☆46 · Updated 3 years ago
- Code for a Chinese error detection module using n-gram and Bi-LSTM ☆135 · Updated 6 years ago