huggingface / transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
⭐ 135,166 · Updated this week
Related projects
Alternatives and complementary repositories for transformers
- TensorFlow code and pre-trained models for BERT ⭐ 38,213 · Updated 3 months ago
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python. ⭐ 30,552 · Updated last month
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ⭐ 35,508 · Updated this week
- Build and share delightful machine learning apps, all in Python. 🌟 Star to support our work! ⭐ 34,030 · Updated this week
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ⭐ 20,199 · Updated 3 months ago
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ⭐ 20,194 · Updated last week
- State-of-the-Art Text Embeddings ⭐ 15,368 · Updated this week
- Tensors and Dynamic neural networks in Python with strong GPU acceleration ⭐ 84,234 · Updated this week
- 🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and FLAX. ⭐ 26,242 · Updated this week
- A library for efficient similarity search and clustering of dense vectors. ⭐ 31,488 · Updated this week
- CLIP (Contrastive Language-Image Pretraining), Predict the most relevant text snippet given an image ⭐ 25,990 · Updated 3 months ago
- 🤗 The largest hub of ready-to-use datasets for ML models with fast, easy-to-use and efficient data manipulation tools ⭐ 19,269 · Updated this week
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ⭐ 7,958 · Updated this week
- Ongoing research training transformer models at scale ⭐ 10,595 · Updated this week
- 💫 Industrial-strength Natural Language Processing (NLP) in Python ⭐ 30,265 · Updated this week
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ⭐ 6,952 · Updated last year
- Visualizer for neural network, deep learning and machine learning models ⭐ 28,167 · Updated this week
- Code for the paper "Language Models are Unsupervised Multitask Learners" ⭐ 22,546 · Updated 3 months ago
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ⭐ 16,471 · Updated this week
- State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enter… ⭐ 13,580 · Updated 3 months ago
- An open-source NLP research library, built on PyTorch. ⭐ 11,759 · Updated last year
- 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production ⭐ 9,052 · Updated this week
- Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes. ⭐ 28,423 · Updated this week
- Open standard for machine learning interoperability ⭐ 17,949 · Updated this week
- Fast and memory-efficient exact attention ⭐ 14,279 · Updated this week
- Unsupervised text tokenizer for Neural Network-based text generation. ⭐ 10,295 · Updated 2 weeks ago
- Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ⭐ 15,560 · Updated last year
- Code and documentation to train Stanford's Alpaca models, and generate the data. ⭐ 29,561 · Updated 4 months ago
- 🦜🔗 Build context-aware reasoning applications ⭐ 95,070 · Updated this week
- Making large AI models cheaper, faster and more accessible ⭐ 38,821 · Updated this week