TinkerMob / keras_albert_model
A Lite BERT
☆59 · Updated 4 years ago
Related projects:
- Keras solution of Chinese NER task using BiLSTM-CRF/BiGRU-CRF/IDCNN-CRF models with a pretrained language model: supporting BERT/RoBERTa/ALB… ☆12 · Updated last year
- CLUE baselines, PyTorch version ☆73 · Updated 4 years ago
- ☆78 · Updated 5 years ago
- Use ELMo in a Chinese environment ☆104 · Updated 5 years ago
- Chinese sequence labeling based on BERT ☆142 · Updated 5 years ago
- Convert https://github.com/brightmart/albert_zh to the Google format ☆62 · Updated 3 years ago
- NER task for the Datagrand algorithm competition, from retraining BERT through fine-tuning and prediction ☆76 · Updated last year
- NLP-related tasks, including text classification, sequence annotation, text relations, and machine translation ☆66 · Updated 4 years ago
- Source code and models for 5th place on leaderboard B of the 2019 Language and Intelligence Challenge, knowledge-driven dialogue track ☆25 · Updated 4 years ago
- Convert Baidu ERNIE's PaddlePaddle model to a TensorFlow model ☆177 · Updated 4 years ago
- Generate character embeddings from Chinese pretrained models; tests BERT and ELMo on Chinese ☆97 · Updated 4 years ago
- ☆89 · Updated 4 years ago
- Tensorflow solution of NER task using BiLSTM-CRF model with CMU/Google XLNet ☆45 · Updated 4 years ago
- ☆125 · Updated this week
- Produce BERT/ALBERT sentence embeddings with a single command, for similarity computation, text classification, etc. ☆34 · Updated last year
- Implementation of XLNet that can load pretrained checkpoints ☆172 · Updated 2 years ago
- Data augmentation demo for NLP ☆47 · Updated 4 years ago
- Transform multi-label classification into a sentence-pair task, giving more training data and information ☆179 · Updated 4 years ago
- BERT/ALBERT/RoBERTa feature-extraction server based on bert-as-service, with the ALBERT model added ☆0 · Updated last year
- ☆91 · Updated 4 years ago
- Transformer-CRF named entity recognition ☆104 · Updated 5 years ago
- A Sentence Cloze Dataset for Chinese Machine Reading Comprehension (CMRC 2019) ☆125 · Updated last year
- Baseline for CCKS2019-IPRE ☆49 · Updated 4 years ago
- ALBERT+BiLSTM+CRF implementation based on the lightweight ALBERT ☆87 · Updated last year
- BERT classification and BERT-DSSM implementations with Keras ☆92 · Updated 4 years ago
- Final Project for EECS496-7 ☆63 · Updated 5 years ago
- Chinese pretrained XLNet model: Pre-Trained Chinese XLNet_Large ☆228 · Updated 5 years ago
- Similar case matching ☆46 · Updated 4 years ago
- Text similarity using BERT ☆65 · Updated 4 years ago
- PyTorch BERT-CRF NER; Chinese named entity recognition ☆46 · Updated 3 years ago