xiaoxiong74 / Time-Extractor
Chinese text time extraction, time conversion, and normalization
☆51 · Updated 4 years ago
Alternatives and similar repositories for Time-Extractor:
Users interested in Time-Extractor are comparing it to the libraries listed below.
- Introductory article on dialogue rewriting ☆95 · Updated last year
- Reproduction of the ACL 2019 paper "Improving Multi-turn Dialogue Modelling with Utterance ReWriter" ☆129 · Updated 4 years ago
- 2019 Chinese human-machine dialogue natural language understanding competition, hosted by the Social Media Processing Technical Committee of the Chinese Information Processing Society of China ☆74 · Updated 4 years ago
- CLUEWSC2020: Chinese version of the WSC Winograd Schema Challenge, a Chinese coreference resolution task ☆71 · Updated 4 years ago
- Investigating Prior Knowledge for Challenging Chinese Machine Reading Comprehension ☆166 · Updated 2 years ago
- BERT fine-tuning for CMRC2018, CJRC, DRCD, CHID, and C3 ☆182 · Updated 4 years ago
- ☆126 · Updated 2 years ago
- ☆101 · Updated 4 years ago
- ☆214 · Updated 5 years ago
- Using BERT for the LIC2019 machine reading comprehension task ☆89 · Updated 5 years ago
- Loading the CPM_LM model with bert4keras ☆51 · Updated 4 years ago
- Using BERT for domain classification, intent recognition, and slot filling ☆74 · Updated 4 years ago
- DistilBERT for Chinese: a distilled BERT model pre-trained on large-scale Chinese corpora ☆90 · Updated 5 years ago
- Chinese UniLM pre-trained model ☆83 · Updated 3 years ago
- Python 3 version of Time-NLP: Chinese time expression recognition ☆87 · Updated 4 years ago
- Joint learning of machine retrieval and reading; rank-6 solution for the LES Cup: 2nd National "Military Intelligent Machine Reading" Challenge ☆126 · Updated 4 years ago
- lasertagger-chinese: a Chinese LaserTagger tutorial with example data, annotations, and shell scripts for running ☆74 · Updated last year
- ☆89 · Updated 4 years ago
- Source code for the paper "PLOME: Pre-training with Misspelled Knowledge for Chinese Spelling Correction" (ACL 2021) ☆233 · Updated 2 years ago
- Pre-trained model invocation based on TensorFlow 1.x, supporting single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision; flexible training, validation, and prediction ☆58 · Updated 3 years ago
- Fine-tuning pre-trained language models (BERT, RoBERTa, XLBert, etc.) to compute the similarity between two texts, cast as a sentence-pair classification task; suited to Chinese text ☆90 · Updated 4 years ago
- Retrieval model from team "网数ICT小分队", joint runner-up (3rd place) at JDDC 2019 ☆45 · Updated last year
- Reading comprehension based on the pre-trained BERT model ☆92 · Updated last year
- A Sentence Cloze Dataset for Chinese Machine Reading Comprehension (CMRC 2019) ☆126 · Updated 2 years ago
- Transformers implementation (architecture, task examples, serving, and more) ☆96 · Updated 2 years ago
- ☆91 · Updated 4 years ago
- Final Project for EECS496-7 ☆62 · Updated 5 years ago
- Li Aolong's (李傲龍) blog ☆81 · Updated 6 months ago
- Notes on the key points of using the T5 model in Keras ☆172 · Updated 2 years ago
- A RoBERTa-wwm-base model distilled from RoBERTa-wwm-large ☆65 · Updated 4 years ago