dongxiaohuang / TextClassifier_Transformer
A personal text classifier built on Google's open-source BERT (fine-tuning approach). It can freely load well-known pretrained NLP language models: BERT, BERT-wwm, RoBERTa, ALBERT, and ERNIE 1.0.
☆42 · Updated 5 years ago
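The fine-tuning approach the repository describes adds a small classification head on top of BERT's pooled [CLS] representation and trains it with cross-entropy. A minimal, framework-free sketch of just that head using numpy (shapes, labels, and the random [CLS] vectors are illustrative stand-ins; the actual project uses the TensorFlow BERT code and backpropagates into the encoder as well):

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
hidden, n_classes, batch = 768, 3, 4   # BERT-base hidden size; 3 illustrative labels

# Classification head parameters (small init, as in the BERT paper's fine-tuning setup).
W = rng.normal(0.0, 0.02, size=(hidden, n_classes))
b = np.zeros(n_classes)

# Stand-in for pooled [CLS] vectors produced by the BERT encoder for 4 sentences.
cls = rng.normal(size=(batch, hidden))
labels = np.array([0, 2, 1, 0])

# Forward pass: logits -> probabilities -> mean cross-entropy loss.
probs = softmax(cls @ W + b)
loss = -np.log(probs[np.arange(batch), labels]).mean()

# One SGD step on the head only (full fine-tuning also updates BERT's weights).
grad_logits = probs.copy()
grad_logits[np.arange(batch), labels] -= 1.0
grad_logits /= batch
W -= 0.1 * (cls.T @ grad_logits)
b -= 0.1 * grad_logits.sum(axis=0)
```

Swapping between BERT, BERT-wwm, RoBERTa, ALBERT, or ERNIE only changes the encoder that produces `cls`; the head and loss stay the same, which is why the repository can load any of them interchangeably.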
Alternatives and similar repositories for TextClassifier_Transformer
Users that are interested in TextClassifier_Transformer are comparing it to the libraries listed below
- Convert Baidu ERNIE's PaddlePaddle model to a TensorFlow model ☆177 · Updated 5 years ago
- Use ELMo in a Chinese-language environment ☆104 · Updated 6 years ago
- ☆279 · Updated 4 years ago
- DIAC 2019 question-equivalence judgment competition based on adversarial attack ☆81 · Updated 5 years ago
- Datagrand 2019 information extraction competition, rank 9 ☆130 · Updated 5 years ago
- Reformulate multi-label classification as a sentence-pair task, with more training data and information ☆178 · Updated 5 years ago
- First-place solution of WSDM Cup 2020: pairwise BERT, LightGBM ☆89 · Updated 5 years ago
- ☆11 · Updated 5 years ago
- 4th-place solution for the 2018 JDDC competition ☆236 · Updated 6 years ago
- BERT classification and BERT-DSSM implementation with Keras ☆93 · Updated 5 years ago
- Convert https://github.com/brightmart/albert_zh to the Google format ☆62 · Updated 4 years ago
- BERT multi-GPU training and pretraining ☆29 · Updated 5 years ago
- Chinese sequence labeling based on BERT ☆141 · Updated 6 years ago
- NLP data augmentation demo ☆47 · Updated 5 years ago
- TensorFlow code and pre-trained models for BERT and ERNIE ☆146 · Updated 6 years ago
- BERT distillation (distillation experiments based on BERT) ☆313 · Updated 4 years ago
- Text-matching competition based on adversarial attack ☆42 · Updated 5 years ago
- 5th-place solution for the duplicate question detection competition based on adversarial attack ☆39 · Updated 5 years ago
- Our experience, lessons, and code ☆48 · Updated 8 years ago
- Rank-2 solution (no BERT) for the 2019 Language and Intelligence Challenge - DuReader 2.0 Machine Reading Comprehension ☆127 · Updated 5 years ago
- Entity recognition code for the "2019 Datagrand Cup: Text Information Extraction Challenge" ☆26 · Updated 5 years ago
- ☆91 · Updated 5 years ago
- Generating character embeddings from Chinese pretrained models; tests how well BERT and ELMo work on Chinese ☆99 · Updated 5 years ago
- 6th-place code for the 2019 Datagrand Cup ☆43 · Updated 2 years ago
- 1st-place solution for the CCKS 2018 open-domain Chinese question answering task ☆110 · Updated 6 years ago
- A BERT-based semantic matching model built with pretrained Chinese models; uses the official LCQMC dataset ☆198 · Updated 5 years ago
- Keras implementation of distilling a 12-layer BERT into CNN, BiLSTM, and 3-layer BERT models ☆28 · Updated 5 years ago
- Pretrained-model training and inference based on TensorFlow 1.x; supports single-machine multi-GPU, gradient accumulation, XLA acceleration, and mixed precision, with flexible training, validation, and prediction ☆58 · Updated 3 years ago
- Many text classification methods, mainly covering TextCNN, TextRNN, LEAM, Transformer, attention, fastText, HAN, etc. ☆75 · Updated 6 years ago
- A baseline for the Xiaobu Assistant dialogue short-text semantic matching task ☆139 · Updated 4 years ago