chdd / bert-utils
Generate BERT sentence vectors with one line of code; use BERT for text classification and text-similarity computation.
☆10 · Updated 5 years ago
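The sentence-vector and similarity workflow described above can be approximated with the Hugging Face transformers library. The sketch below is an illustration of the same idea, not the bert-utils API itself; the model name bert-base-chinese and the mean-pooling step are assumptions.

```python
# Sketch: sentence vectors with a Chinese BERT via Hugging Face transformers.
# Illustrates the same idea as bert-utils (sentence embedding + similarity),
# but this is NOT the bert-utils API; model name and pooling are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModel.from_pretrained("bert-base-chinese")
model.eval()

def encode(sentences):
    """Return one mean-pooled vector per sentence."""
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state          # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)            # (B, H)

vecs = encode(["今天天气不错", "今天天气很好", "我在学习自然语言处理"])
sim = torch.nn.functional.cosine_similarity(vecs[0], vecs[1], dim=0)
print(f"similarity: {sim.item():.4f}")
```

Mean pooling over the attention mask is used here only as a simple pooling choice; the repositories listed below may pool differently (for example, taking the [CLS] vector).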
Alternatives and similar repositories for bert-utils
Users interested in bert-utils are comparing it to the repositories listed below.
- Convert https://github.com/brightmart/albert_zh to the Google format ☆62 · Updated 4 years ago
- bert_chinese ☆39 · Updated 2 years ago
- Keras solution of the Chinese NER task using BiLSTM-CRF/BiGRU-CRF/IDCNN-CRF models with a pretrained language model, supporting BERT/RoBERTa/ALBERT ☆12 · Updated 2 years ago
- Code for 5th place in the information extraction track of the 2019 Baidu Language and Intelligence Technology Competition ☆69 · Updated 5 years ago
- 2019 Chinese human-machine dialogue natural language understanding competition, organized by the Social Media Processing Technical Committee of the Chinese Information Processing Society of China ☆74 · Updated 5 years ago
- Source code and models for 5th place on leaderboard B of the knowledge-driven dialogue track, 2019 Language and Intelligence Technology Competition ☆25 · Updated 5 years ago
- Entity linking demo ☆65 · Updated 6 years ago
- 2019 Language and Intelligence Technology Competition: knowledge-graph-based proactive chat ☆115 · Updated 6 years ago
- Use ELMo in a Chinese environment ☆104 · Updated 6 years ago
- Data augmentation demo for NLP ☆47 · Updated 5 years ago
- Adversarial attack text matching competition ☆42 · Updated 5 years ago
- NER task from the Datagrand algorithm competition, from retraining BERT to fine-tuning and prediction ☆74 · Updated 2 years ago
- 6th-place code for the 2019 Datagrand Cup ☆43 · Updated 2 years ago
- Named entity recognition with Transformer + CRF ☆106 · Updated 6 years ago
- SMP2018 Chinese Human-Computer Dialogue Technology Evaluation (ECDT) ☆47 · Updated 6 years ago
- Performance comparison of Chinese word segmentation and part-of-speech tagging tools ☆58 · Updated 5 years ago
- Character-embedding learning with Chinese pretrained models, testing the Chinese performance of BERT and ELMo ☆99 · Updated 5 years ago
- Similar case matching ☆46 · Updated 5 years ago
- First-place solution for the CCKS 2019 Baidu entity linking competition ☆57 · Updated 5 years ago
- Tianchi epidemic text challenge ☆49 · Updated 5 years ago
- Joint learning of machine retrieval and reading comprehension; rank-6 solution for the 莱斯杯 2nd National "Military Intelligent Machine Reading" Challenge ☆127 · Updated 4 years ago
- Keras solution of a simple knowledge-based QA task with a pretrained language model, supporting BERT/RoBERTa/ALBERT ☆21 · Updated 2 years ago
- NLP-related tasks, including text classification, sequence labeling, text relations, machine translation, and others ☆67 · Updated 5 years ago
- ALBERT + BiLSTM + CRF built on the lightweight ALBERT (see the BiLSTM-CRF sketch after this list) ☆89 · Updated 2 years ago
- Chinese text classification with BERT ☆40 · Updated 6 years ago
- ☆91 · Updated 5 years ago
- ☆17 · Updated 6 years ago
- First-place solution for the CCKS 2018 open-domain Chinese question answering task ☆110 · Updated 6 years ago
- Uses a pretrained ALBERT model to recognize time expressions in text, and checks whether the model's inference time improves significantly ☆56 · Updated 5 years ago
- Domain classification, intent recognition, and slot filling with BERT ☆76 · Updated 5 years ago
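Several of the entries above (the Keras NER solutions and the ALBERT + BiLSTM + CRF repository) share the same BiLSTM-CRF tagging architecture. The sketch below outlines that architecture in PyTorch with the third-party pytorch-crf package; the dimensions, tag set, and choice of library are assumptions for illustration, not code from any of the listed repositories.

```python
# Sketch: a BiLSTM-CRF sequence tagger, the architecture shared by several of
# the NER repositories listed above. Dimensions, tag count, and the use of the
# pytorch-crf package are illustrative assumptions, not code from those repos.
import torch
import torch.nn as nn
from torchcrf import CRF  # pip install pytorch-crf

class BiLSTMCRF(nn.Module):
    def __init__(self, vocab_size, num_tags, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim // 2, batch_first=True,
                            bidirectional=True)
        self.emit = nn.Linear(hidden_dim, num_tags)   # per-token emission scores
        self.crf = CRF(num_tags, batch_first=True)    # transition scores + Viterbi

    def loss(self, tokens, tags, mask):
        emissions = self.emit(self.lstm(self.embed(tokens))[0])
        return -self.crf(emissions, tags, mask=mask)  # negative log-likelihood

    def decode(self, tokens, mask):
        emissions = self.emit(self.lstm(self.embed(tokens))[0])
        return self.crf.decode(emissions, mask=mask)  # best tag path per sample

# Toy usage: batch of 2 sentences, max length 5, 7 BIO tags.
model = BiLSTMCRF(vocab_size=5000, num_tags=7)
tokens = torch.randint(1, 5000, (2, 5))
tags = torch.randint(0, 7, (2, 5))
mask = torch.ones(2, 5, dtype=torch.uint8)
print(model.loss(tokens, tags, mask).item(), model.decode(tokens, mask))
```

In the repositories above, the embedding layer would typically be replaced by frozen or fine-tuned BERT/ALBERT outputs feeding the BiLSTM; the CRF layer on top is what enforces valid tag transitions.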