huawei-noah / Pretrained-Language-Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
☆3,148 · Updated last year
Alternatives and similar repositories for Pretrained-Language-Model
Users interested in Pretrained-Language-Model are comparing it to the libraries listed below.
- Open-source pre-training model framework in PyTorch & pre-trained model zoo ☆3,100 · Updated last year
- A PyTorch-based knowledge distillation toolkit for natural language processing ☆1,686 · Updated 2 years ago
- ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations, with large-scale pre-trained Chinese ALBERT models ☆3,989 · Updated 3 years ago
- BERT NLP papers, applications, and GitHub resources, including the newest XLNet (BERT- and XLNet-related papers and GitHub projects) ☆1,849 · Updated 4 years ago
- ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators ☆2,365 · Updated last year
- [EMNLP 2021] SimCSE: Simple Contrastive Learning of Sentence Embeddings https://arxiv.org/abs/2104.08821 ☆3,612 · Updated last year
- Implementation of BERT that can load official pre-trained models for feature extraction and prediction ☆2,427 · Updated 3 years ago
- Source code and dataset for the ACL 2019 paper "ERNIE: Enhanced Language Representation with Informative Entities" ☆1,420 · Updated last year
- An open-source neural hierarchical multi-label text classification toolkit ☆1,912 · Updated last week
- Pre-trained Chinese RoBERTa models: RoBERTa for Chinese ☆2,760 · Updated last year
- Pre-trained Chinese ELECTRA models ☆1,438 · Updated 4 months ago
- Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpus, and leaderboard ☆1,780 · Updated 2 years ago
- Multi-Task Deep Neural Networks for Natural Language Understanding ☆2,258 · Updated last year
- Must-read papers on pre-trained language models ☆3,364 · Updated 3 years ago
- EasyTransfer is designed to make the development of transfer learning in NLP applications easier ☆861 · Updated 3 years ago
- Pre-trained Chinese XLNet models ☆1,651 · Updated 4 months ago
- Kashgari is a production-level NLP transfer learning framework built on top of tf.keras for text-labeling and text-classification, includ… ☆2,387 · Updated last year
- ALBERT: A Lite BERT for Self-supervised Learning of Language Representations ☆3,273 · Updated 2 years ago
- Novel deep learning research works with PaddlePaddle ☆1,751 · Updated last year
- Keras implementation of Transformers for humans ☆5,418 · Updated last year
- Chinese Language Understanding Evaluation benchmark: datasets, baselines, pre-trained models, corpus, and leaderboard ☆4,211 · Updated 2 months ago
- A Lite BERT for Self-Supervised Learning of Language Representations ☆718 · Updated 5 years ago
- ☆3,679 · Updated 3 years ago
- XLNet: Generalized Autoregressive Pretraining for Language Understanding ☆6,178 · Updated 2 years ago
- fastNLP: A modularized and extensible NLP framework, currently still in incubation ☆3,143 · Updated 2 years ago
- Datasets and SOTA results for every field of Chinese NLP ☆1,811 · Updated 3 years ago
- Chinese NER (Named Entity Recognition) using BERT (Softmax, CRF, Span) ☆2,227 · Updated 2 years ago
- The source code of FastBERT (ACL 2020) ☆608 · Updated 4 years ago
- DeepIE: Deep Learning for Information Extraction ☆1,944 · Updated 2 years ago
- The official repository for ERNIE 4.5 and ERNIEKit, its industrial-grade development toolkit based on PaddlePaddle ☆7,615 · Updated this week