huggingface / transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
☆142,871 · Updated this week
Alternatives and similar repositories for transformers:
Users interested in transformers are comparing it to the libraries listed below.
- facebookresearch/fairseq: Facebook AI Research Sequence-to-Sequence Toolkit written in Python. ☆31,297 · Updated 3 months ago
- Lightning-AI/pytorch-lightning: Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes. ☆29,263 · Updated last week
- microsoft/unilm: Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ☆21,052 · Updated last month
- pytorch/pytorch: Tensors and Dynamic neural networks in Python with strong GPU acceleration ☆88,834 · Updated this week
- huggingface/datasets: 🤗 The largest hub of ready-to-use datasets for ML models with fast, easy-to-use and efficient data manipulation tools (see the composition sketch after this list) ☆19,969 · Updated this week
- google-research/bert: TensorFlow code and pre-trained models for BERT ☆39,000 · Updated 8 months ago
- huggingface/tokenizers: 🔥 Fast State-of-the-Art Tokenizers optimized for Research and Production ☆9,572 · Updated 3 weeks ago
- NVIDIA/DeepLearningExamples: State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enter… ☆14,146 · Updated 8 months ago
- deepspeedai/DeepSpeed: DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆37,834 · Updated this week
- openai/gpt-2: Code for the paper "Language Models are Unsupervised Multitask Learners" ☆23,295 · Updated 8 months ago
- gradio-app/gradio: Build and share delightful machine learning apps, all in Python. 🌟 Star to support our work! ☆37,426 · Updated this week
- google/sentencepiece: Unsupervised text tokenizer for Neural Network-based text generation. ☆10,786 · Updated last week
- lutzroeder/netron: Visualizer for neural network, deep learning and machine learning models ☆29,893 · Updated this week
- huggingface/accelerate: 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ☆8,608 · Updated this week
- jax-ml/jax: Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more ☆31,898 · Updated this week
- google-research/google-research: Google Research ☆35,347 · Updated this week
- huggingface/peft: 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ☆18,082 · Updated this week
- huggingface/trl: Train transformer language models with reinforcement learning. ☆13,166 · Updated this week
- facebookresearch/faiss: A library for efficient similarity search and clustering of dense vectors. ☆34,255 · Updated this week
- tensorflow/tensorflow: An Open Source Machine Learning Framework for Everyone ☆189,339 · Updated this week
- tensorflow/tensor2tensor: Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ☆16,015 · Updated last year
- jessevig/bertviz: BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ☆7,306 · Updated last year
- meta-llama/llama: Inference code for Llama models ☆58,078 · Updated 2 months ago
- Dao-AILab/flash-attention: Fast and memory-efficient exact attention ☆16,835 · Updated this week
- huggingface/diffusers: 🤗 Diffusers: State-of-the-art diffusion models for image, video, and audio generation in PyTorch and FLAX. ☆28,543 · Updated this week
- google-research/text-to-text-transfer-transformer: Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ☆6,325 · Updated last month
- jadore801120/attention-is-all-you-need-pytorch: A PyTorch implementation of the Transformer model in "Attention is All You Need". ☆9,128 · Updated 11 months ago
- harvardnlp/annotated-transformer: An annotated implementation of the Transformer paper. ☆6,157 · Updated last year
- langchain-ai/langchain: 🦜🔗 Build context-aware reasoning applications ☆105,331 · Updated this week
- hpcaitech/ColossalAI: Making large AI models cheaper, faster and more accessible ☆40,766 · Updated this week