huggingface / transformers
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal domains, for both inference and training.
⭐ 145,689 · Updated this week
Alternatives and similar repositories for transformers
Users interested in transformers are comparing it to the libraries listed below.
- 🤗 The largest hub of ready-to-use datasets for ML models with fast, easy-to-use and efficient data manipulation tools ⭐ 20,270 · Updated this week
- TensorFlow code and pre-trained models for BERT ⭐ 39,226 · Updated 10 months ago
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python. ⭐ 31,523 · Updated last week
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ⭐ 38,833 · Updated this week
- State-of-the-Art Text Embeddings ⭐ 16,898 · Updated last week
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ⭐ 21,391 · Updated 2 weeks ago
- Google Research ⭐ 35,790 · Updated this week
- Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes. ⭐ 29,612 · Updated this week
- Code for the paper "Language Models are Unsupervised Multitask Learners" ⭐ 23,665 · Updated 10 months ago
- A library for efficient similarity search and clustering of dense vectors. ⭐ 35,530 · Updated this week
- A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Auto… ⭐ 14,800 · Updated this week
- State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enter… ⭐ 14,329 · Updated 10 months ago
- Making large AI models cheaper, faster and more accessible ⭐ 40,966 · Updated this week
- Inference code for Llama models ⭐ 58,368 · Updated 4 months ago
- Visualizer for neural network, deep learning and machine learning models ⭐ 30,450 · Updated this week
- CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image ⭐ 29,407 · Updated 10 months ago
- 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production ⭐ 9,796 · Updated last week
- Unsupervised text tokenizer for Neural Network-based text generation. ⭐ 10,994 · Updated 2 months ago
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ⭐ 8,839 · Updated this week
- Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more ⭐ 32,533 · Updated this week
- Ongoing research training transformer models at scale ⭐ 12,564 · Updated this week
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… ⭐ 34,444 · Updated this week
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ⭐ 7,473 · Updated 2 weeks ago
- Build and share delightful machine learning apps, all in Python. 🌟 Star to support our work! ⭐ 38,620 · Updated this week
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ⭐ 22,073 · Updated 10 months ago
- LlamaIndex is the leading framework for building LLM-powered agents over your data. ⭐ 42,374 · Updated this week
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ⭐ 18,774 · Updated last week
- Fast and memory-efficient exact attention ⭐ 17,846 · Updated this week
- Train transformer language models with reinforcement learning. ⭐ 14,193 · Updated this week
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" ⭐ 12,090 · Updated 6 months ago