huggingface / transformers
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal tasks, for both inference and training.
⭐ 153,866 · Updated this week
Alternatives and similar repositories for transformers
Users interested in transformers are comparing it to the libraries listed below.
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python. ⭐ 32,020 · Updated 2 months ago
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ⭐ 41,015 · Updated this week
- 🤗 Diffusers: State-of-the-art diffusion models for image, video, and audio generation in PyTorch. ⭐ 32,045 · Updated this week
- State-of-the-Art Text Embeddings ⭐ 18,008 · Updated this week
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ⭐ 21,875 · Updated 5 months ago
- TensorFlow code and pre-trained models for BERT ⭐ 39,730 · Updated last year
- Unsupervised text tokenizer for Neural Network-based text generation. ⭐ 11,508 · Updated this week
- Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ⭐ 16,818 · Updated 2 years ago
- State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enter… ⭐ 14,630 · Updated last year
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ⭐ 9,377 · Updated this week
- 🤗 The largest hub of ready-to-use datasets for AI models with fast, easy-to-use and efficient data manipulation tools ⭐ 20,976 · Updated last week
- Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more ⭐ 34,299 · Updated this week
- 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production ⭐ 10,301 · Updated 2 weeks ago
- Tensors and Dynamic neural networks in Python with strong GPU acceleration ⭐ 95,775 · Updated this week
- Ongoing research training transformer models at scale ⭐ 14,602 · Updated this week
- GPT-3: Language Models are Few-Shot Learners ⭐ 15,775 · Updated 5 years ago
- Train transformer language models with reinforcement learning. ⭐ 16,638 · Updated this week
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ⭐ 23,160 · Updated last year
- Inference code for Llama models ⭐ 58,983 · Updated 10 months ago
- Code for the paper "Language Models are Unsupervised Multitask Learners" ⭐ 24,472 · Updated last year
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ⭐ 7,833 · Updated 6 months ago
- Build and share delightful machine learning apps, all in Python. 🌟 Star to support our work! ⭐ 40,937 · Updated this week
- A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Auto… ⭐ 16,271 · Updated this week
- An open-source NLP research library, built on PyTorch. ⭐ 11,886 · Updated 3 years ago
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ⭐ 20,268 · Updated last week
- Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes. ⭐ 30,561 · Updated last week
- Google AI 2018 BERT pytorch implementation ⭐ 6,507 · Updated 2 years ago
- Making large AI models cheaper, faster and more accessible ⭐ 41,291 · Updated last week
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… ⭐ 35,990 · Updated this week
- An annotated implementation of the Transformer paper. ⭐ 6,844 · Updated last year