liushaoweihua / bert_encode_server
BERT/ALBERT/RoBERTa feature-extraction server, based on bert-as-service, with an added ALBERT model.
☆0 · Updated 2 years ago
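Since the project is based on bert-as-service, a minimal usage sketch (assuming the repo keeps the standard bert-as-service client interface and that a server has already been started against a BERT/ALBERT/RoBERTa checkpoint; the model path and worker count below are illustrative, not taken from this repo) might look like:

```python
# Server side (illustrative):
#   bert-serving-start -model_dir /path/to/chinese_model -num_worker=2
from bert_serving.client import BertClient

# Connect to the encoding server (default ports 5555/5556 on localhost).
bc = BertClient(ip='localhost')

# Encode a batch of sentences into fixed-length feature vectors.
vectors = bc.encode(['今天天气不错', '服务端基于 bert-as-service'])
print(vectors.shape)  # (2, hidden_size), e.g. (2, 768) for base models
```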
Alternatives and similar repositories for bert_encode_server
Users interested in bert_encode_server are comparing it to the libraries listed below.
- Capsule-based opinion-oriented reading comprehension model ☆89 · Updated 5 years ago
- BERT-based Chinese sequence labeling ☆141 · Updated 6 years ago
- ☆89 · Updated 4 years ago
- Converts https://github.com/brightmart/albert_zh to the Google format ☆62 · Updated 4 years ago
- TensorFlow version of BERT-of-Theseus ☆62 · Updated 4 years ago
- Rank-2 solution (no BERT) for the 2019 Language and Intelligence Challenge - DuReader2.0 Machine Reading Comprehension ☆127 · Updated 5 years ago
- A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm-large ☆65 · Updated 5 years ago
- Use ELMo in a Chinese environment ☆104 · Updated 6 years ago
- DistilBERT for Chinese: a large-scale Chinese pre-trained distilled BERT model ☆92 · Updated 5 years ago
- Dataset for the CIKM 2018 paper "Multi-Source Pointer Network for Product Title Summarization" ☆73 · Updated 6 years ago
- Converts Baidu ERNIE's PaddlePaddle model to a TensorFlow model ☆177 · Updated 5 years ago
- ☆59 · Updated 5 years ago
- Keras sparse implementation of margin-softmax ☆100 · Updated 6 years ago
- Data augmentation demo for NLP ☆47 · Updated 5 years ago
- Adversarial-attack text-matching competition ☆42 · Updated 5 years ago
- TensorFlow solution for the NER task using a BiLSTM-CRF model with CMU/Google XLNet ☆45 · Updated 5 years ago
- PyTorch version of the CLUE baselines ☆74 · Updated 5 years ago
- TensorFlow code and pre-trained models for BERT and ERNIE ☆145 · Updated 5 years ago
- Pre-trained Chinese XLNet model: XLNet_Large ☆230 · Updated 5 years ago
- 6th-place code for the 2019 DataGrand Cup ☆43 · Updated 2 years ago
- First-place solution for WSDM Cup 2020: pairwise BERT, LightGBM ☆89 · Updated 5 years ago
- Character-vector learning with Chinese pre-trained models, testing BERT and ELMo on Chinese ☆99 · Updated 5 years ago
- 2019 Language and Intelligence Challenge, knowledge-driven dialogue track: 5th place on leaderboard B, source code and model ☆25 · Updated 5 years ago
- CCKS 2019 Task 2: Entity Recognition and Linking ☆94 · Updated 5 years ago
- 2020 Language and Intelligence Challenge, relation extraction track: 3rd-place solution ☆28 · Updated 4 years ago
- 1st-place solution for the CCKS 2018 open-domain Chinese question answering task ☆109 · Updated 5 years ago
- Transforms multi-label classification into a sentence-pair task, with more training data and information ☆178 · Updated 5 years ago
- Implementation of the cw2vec model ☆28 · Updated 6 years ago
- A text classifier built on Google's open-source BERT (fine-tuning based); can freely load well-known pre-trained language models: BERT, BERT-wwm, RoBERTa, ALBERT, and ERNIE 1.0 ☆42 · Updated 5 years ago
- Pre-trained model invocation based on TensorFlow 1.x; supports single-machine multi-GPU, gradient accumulation, XLA acceleration, and mixed precision; flexible training, evaluation, and prediction ☆58 · Updated 3 years ago