huggingface / transformers
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models for text, vision, audio, and multimodal tasks, for both inference and training.
☆155,967 · Updated last week
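As a minimal sketch of the inference side of the library, using the high-level `pipeline` API (the default model downloaded for the task is chosen by the library; the input sentence is just an illustrative example):

```python
from transformers import pipeline

# Build a default sentiment-analysis pipeline; the first call downloads a small
# pretrained model and tokenizer from the Hugging Face Hub.
classifier = pipeline("sentiment-analysis")

# Inference returns a list of {"label": ..., "score": ...} dicts.
print(classifier("Transformers makes state-of-the-art models easy to use."))
```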
Alternatives and similar repositories for transformers
Users interested in transformers are comparing it to the libraries listed below.
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python. ☆32,125 · Updated 4 months ago
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆41,509 · Updated this week
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ☆22,002 · Updated 2 weeks ago
- Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes. ☆30,803 · Updated this week
- TensorFlow code and pre-trained models for BERT ☆39,834 · Updated last year
- Google Research ☆37,192 · Updated last week
- 🤗 The largest hub of ready-to-use datasets for AI models with fast, easy-to-use and efficient data manipulation tools ☆21,174 · Updated this week (see the loading sketch after this list)
- Train transformer language models with reinforcement learning. ☆17,297 · Updated this week
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ☆20,587 · Updated this week
- Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the mo… ☆22,978 · Updated last year
- BertViz: Visualize Attention in Transformer Models ☆7,896 · Updated 3 weeks ago
- Unsupervised text tokenizer for Neural Network-based text generation. ☆11,616 · Updated 2 weeks ago
- Fast and memory-efficient exact attention ☆22,113 · Updated this week
- A library for efficient similarity search and clustering of dense vectors. ☆38,999 · Updated this week
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ☆9,477 · Updated last week
- Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ☆16,969 · Updated 2 years ago
- State-of-the-Art Text Embeddings ☆18,192 · Updated last week
- Ongoing research training transformer models at scale ☆15,100 · Updated this week
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆23,451 · Updated last year
- 🤗 Diffusers: State-of-the-art diffusion models for image, video, and audio generation in PyTorch. ☆32,703 · Updated this week
- 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production ☆10,445 · Updated this week
- Build and share delightful machine learning apps, all in Python. 🌟 Star to support our work! ☆41,496 · Updated last week
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… ☆36,313 · Updated last week
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆69,622 · Updated this week
- Natural Language Processing Tutorial for Deep Learning Researchers ☆14,844 · Updated last year
- [NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond. ☆24,426 · Updated last year
- An annotated implementation of the Transformer paper. ☆6,973 · Updated last year
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆24,585 · Updated last year
- Tensors and Dynamic neural networks in Python with strong GPU acceleration ☆97,130 · Updated this week
- An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena. ☆39,392 · Updated 8 months ago
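For the 🤗 Datasets entry pointed to above, a minimal loading sketch; the dataset name "imdb" and the 100-example slice are only illustrative choices:

```python
from datasets import load_dataset

# Load the first 100 training examples of a public Hub dataset
# ("imdb" is only an illustrative choice of dataset).
ds = load_dataset("imdb", split="train[:100]")

# Each record is a plain dict; this dataset exposes "text" and "label" columns.
print(ds[0]["label"], ds[0]["text"][:80])
```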