TensorFlow code and pre-trained models for BERT
☆39,875 · last updated Jul 23, 2024
Alternatives and similar repositories for bert
Users interested in bert are comparing it to the libraries listed below.
- 🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal model… · ☆157,071 · updated this week
- 🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP · ☆12,817 · updated Jan 23, 2024
- XLNet: Generalized Autoregressive Pretraining for Language Understanding · ☆6,176 · updated May 28, 2023
- An open-source NLP research library, built on PyTorch · ☆11,889 · updated Nov 22, 2022
- Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the mo… · ☆22,981 · updated Jul 28, 2024
- Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series) · ☆10,175 · updated Jul 15, 2025
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python · ☆32,170 · updated Sep 30, 2025
- Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research · ☆17,012 · updated Jun 2, 2023
- Google AI 2018 BERT PyTorch implementation · ☆6,519 · updated Sep 15, 2023
- 100+ Chinese Word Vectors (over a hundred pretrained Chinese word embeddings) · ☆12,183 · updated Oct 30, 2023
- Library for fast text representation and classification · ☆26,502 · updated Mar 22, 2024
- Models and examples built with TensorFlow · ☆77,684 · updated Feb 13, 2026
- Code for the paper "Language Models are Unsupervised Multitask Learners" · ☆24,635 · updated Aug 14, 2024
- The official repository for ERNIE 4.5 and ERNIEKit, its industrial-grade development toolkit based on PaddlePaddle · ☆7,681 · updated Jan 4, 2026
- All kinds of text classification models and more with deep learning · ☆7,951 · updated Sep 28, 2023
- Google Research · ☆37,367 · updated this week
- A Lite BERT for Self-Supervised Learning of Language Representations (a large collection of pretrained Chinese ALBERT models) · ☆3,983 · updated Nov 21, 2022
- A very simple framework for state-of-the-art Natural Language Processing (NLP) · ☆14,359 · updated Oct 27, 2025
- TensorFlow solution for the NER task using a BiLSTM-CRF model with Google BERT fine-tuning and private server services · ☆4,900 · updated Feb 24, 2021
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" · ☆6,490 · updated Jan 14, 2026
- ALBERT: A Lite BERT for Self-supervised Learning of Language Representations · ☆3,276 · updated Apr 14, 2023
- Unsupervised text tokenizer for Neural Network-based text generation · ☆11,668 · updated this week
- Large Scale Chinese Corpus for NLP · ☆9,855 · updated Feb 6, 2026
- A library for efficient similarity search and clustering of dense vectors · ☆39,195 · updated this week
- A natural language modeling framework based on PyTorch · ☆6,305 · updated Oct 17, 2022
- State-of-the-Art Text Embeddings · ☆18,298 · updated Feb 20, 2026
- 💫 Industrial-strength Natural Language Processing (NLP) in Python · ☆33,254 · updated Nov 27, 2025
- Deep Learning for humans · ☆63,866 · updated this week
- Topic Modelling for Humans · ☆16,361 · updated Nov 1, 2025
- Tensors and Dynamic neural networks in Python with strong GPU acceleration · ☆97,688 · updated this week
- An Open Source Machine Learning Framework for Everyone · ☆193,905 · updated this week
- Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on sing… · ☆28,035 · updated this week
- Natural Language Processing Tutorial for Deep Learning Researchers · ☆14,857 · updated Feb 21, 2024
- Keras implementation of transformers for humans · ☆5,421 · updated Nov 11, 2024
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective · ☆41,648 · updated this week
- PyTorch original implementation of Cross-lingual Language Model Pretraining.
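Several entries above are subword tokenizer libraries (SentencePiece, the tokenizers bundled with 🤗 Transformers); BERT itself segments words with WordPiece using a greedy longest-match-first strategy. A minimal sketch of that algorithm in plain Python, using a tiny hypothetical vocabulary for illustration (the real BERT vocabulary has roughly 30,000 entries):

```python
# Sketch of BERT-style WordPiece tokenization (greedy longest-match-first).
# The vocabulary here is hypothetical and only large enough to demonstrate
# the splitting behavior; "##" marks a continuation piece inside a word.

VOCAB = {"un", "##aff", "##able", "play", "##ing", "[UNK]"}

def wordpiece(word, vocab=VOCAB, max_chars=100):
    """Split one word into WordPiece sub-tokens, or [UNK] if it can't be split."""
    if len(word) > max_chars:
        return ["[UNK]"]
    tokens, start = [], 0
    while start < len(word):
        end = len(word)
        current = None
        # Try the longest remaining substring first, shrinking until a vocab hit.
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # not word-initial: add continuation marker
            if piece in vocab:
                current = piece
                break
            end -= 1
        if current is None:
            return ["[UNK]"]  # no piece matched: the whole word is unknown
        tokens.append(current)
        start = end
    return tokens

print(wordpiece("unaffable"))  # ['un', '##aff', '##able']
print(wordpiece("playing"))    # ['play', '##ing']
print(wordpiece("xyz"))        # ['[UNK]']
```

The greedy longest-match rule is why rare words decompose into a few meaningful pieces rather than many single characters; the production implementations in the repositories above add vocabulary loading, special tokens, and pre-tokenization on whitespace and punctuation around this core loop.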