CyberZHG / keras-xlnet
Implementation of XLNet that can load pretrained checkpoints
☆171 · Updated 3 years ago
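Since the headline feature is loading pretrained XLNet checkpoints, a minimal usage sketch may help. This assumes the `keras_xlnet` package exposes `Tokenizer`, `load_trained_model_from_checkpoint`, and `ATTENTION_TYPE_BI` as its README describes; the checkpoint directory and file names below are placeholders to verify against an actual released checkpoint.

```python
# Hedged sketch: loading an official XLNet checkpoint with keras-xlnet.
# Function and constant names follow the project's README as recalled;
# treat them (and the placeholder paths) as assumptions to verify.
import os
from keras_xlnet import Tokenizer, load_trained_model_from_checkpoint, ATTENTION_TYPE_BI

checkpoint_dir = 'xlnet_cased_L-24_H-1024_A-16'  # placeholder checkpoint folder

tokenizer = Tokenizer(os.path.join(checkpoint_dir, 'spiece.model'))
model = load_trained_model_from_checkpoint(
    config_path=os.path.join(checkpoint_dir, 'xlnet_config.json'),
    checkpoint_path=os.path.join(checkpoint_dir, 'xlnet_model.ckpt'),
    batch_size=16,
    memory_len=512,        # length of the reusable memory segment
    target_len=128,        # length of the current input segment
    in_train_phase=False,  # inference mode; no permutation-LM training targets
    attention_type=ATTENTION_TYPE_BI,
)
model.summary()
```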
Alternatives and similar repositories for keras-xlnet
Users interested in keras-xlnet are comparing it to the repositories listed below
- Pre-Trained Chinese XLNet_Large model ☆231 · Updated 6 years ago
- Transform multi-label classification into a sentence-pair task, with more training data and information ☆178 · Updated 5 years ago
- Export a BERT model for serving ☆141 · Updated 6 years ago
- Convert Baidu ERNIE's PaddlePaddle model to a TensorFlow model ☆178 · Updated 6 years ago
- Keras solution for the Chinese NER task using BiLSTM-CRF/BiGRU-CRF/IDCNN-CRF models with a pretrained language model: supporting BERT/RoBERTa/ALBERT ☆13 · Updated 2 years ago
- Source code of the ACL 2019 paper "Simple and Effective Text Matching with Richer Alignment Features" ☆340 · Updated 6 years ago
- Chinese sequence labeling based on BERT ☆141 · Updated 7 years ago
- A Lite BERT ☆60 · Updated 6 years ago
- TensorFlow code and pre-trained models for BERT and ERNIE ☆148 · Updated 6 years ago
- NER with BERT ☆281 · Updated 6 years ago
- Deep contextualized word representations for Chinese ☆151 · Updated 5 years ago
- ALBERT model pretraining and fine-tuning using TF 2.0 ☆204 · Updated 2 years ago
- Byte Cup 2018 International Machine Learning Contest (3rd prize) ☆77 · Updated 3 years ago
- Use ELMo in a Chinese environment ☆105 · Updated 7 years ago
- Keras implementation of DGCNN for reading comprehension ☆164 · Updated 6 years ago
- Fine-tune large BERT models with multi-GPU and FP16 support ☆192 · Updated 5 years ago
- Learn character embeddings from Chinese pre-trained models; evaluate the Chinese performance of BERT and ELMo ☆100 · Updated 5 years ago
- Hierarchically-Refined Label Attention Network for Sequence Labeling ☆293 · Updated 4 years ago
- Named entity recognition with Transformer + CRF ☆107 · Updated 6 years ago
- Data Augmentation for NLP ☆295 · Updated 4 years ago
- Sequence labeling based on Universal Transformer (Transformer encoder) and CRF; Chinese word segmentation and POS tagging with Universal Transformer + CRF ☆163 · Updated 6 years ago
- Adversarial Transfer Learning for Chinese Named Entity Recognition with Self-Attention Mechanism ☆202 · Updated 7 years ago
- ☆535 · Updated 6 years ago
- ☆280 · Updated 4 years ago
- Transformer-based models implemented in TensorFlow 2.x (using Keras) ☆76 · Updated 3 years ago
- A baseline for Baidu's 2019 entity linking competition (CCKS 2019) ☆112 · Updated 6 years ago
- A baseline for Baidu's 2019 triple extraction competition ☆208 · Updated 6 years ago
- A BERT-based semantic matching model built on a pre-trained Chinese model; uses the official LCQMC dataset ☆199 · Updated 5 years ago
- DistilBERT for Chinese: a distilled BERT model pre-trained on large-scale Chinese data ☆95 · Updated 5 years ago
- 1. Use BERT, ALBERT and GPT2 as TensorFlow 2.0 layers. 2. Implement GCN, GAN, GIN and GraphSAGE based on message passing. ☆338 · Updated last year