huggingface / transformers
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal domains, for both inference and training.
★ 149,853 · Updated this week
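Transformers is best known for its high-level pipeline API; as a minimal, illustrative sketch (assuming `transformers` plus a backend such as PyTorch are installed, and letting the library pick its default sentiment-analysis checkpoint from the Hub):

```python
# Minimal inference sketch using the Transformers pipeline API.
# The default checkpoint for the "sentiment-analysis" task is downloaded
# from the Hugging Face Hub on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Transformers makes state-of-the-art models easy to use."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```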
Alternatives and similar repositories for transformers
Users interested in transformers are comparing it to the libraries listed below.
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python. ★ 31,810 · Updated last week
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ★ 40,058 · Updated last week
- Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes. ★ 30,113 · Updated this week
- TensorFlow code and pre-trained models for BERT ★ 39,509 · Updated last year
- Unsupervised text tokenizer for Neural Network-based text generation. ★ 11,270 · Updated this week
- State-of-the-Art Text Embeddings ★ 17,513 · Updated 2 weeks ago
- 🔥 Fast State-of-the-Art Tokenizers optimized for Research and Production ★ 10,060 · Updated 2 weeks ago
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ★ 21,713 · Updated 2 months ago
- Graph Neural Network Library for PyTorch ★ 22,873 · Updated last week
- 🤗 The largest hub of ready-to-use datasets for AI models with fast, easy-to-use and efficient data manipulation tools ★ 20,647 · Updated this week
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ★ 19,570 · Updated last week
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ★ 9,133 · Updated this week
- Ongoing research training transformer models at scale ★ 13,602 · Updated this week
- An open-source NLP research library, built on PyTorch. ★ 11,878 · Updated 2 years ago
- Tensors and Dynamic neural networks in Python with strong GPU acceleration ★ 93,116 · Updated this week
- A library for efficient similarity search and clustering of dense vectors. (Paired with a text-embedding model in the sketch after this list.) ★ 37,076 · Updated last week
- Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ★ 16,466 · Updated 2 years ago
- Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more ★ 33,428 · Updated this week
- Google Research ★ 36,381 · Updated this week
- The repository provides code for running inference with the Segment Anything Model (SAM), links for downloading the trained model checkpoi… ★ 51,821 · Updated 11 months ago
- Build and share delightful machine learning apps, all in Python. Star to support our work! ★ 39,906 · Updated this week
- Google AI 2018 BERT pytorch implementation ★ 6,447 · Updated 2 years ago
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ★ 7,639 · Updated 3 months ago
- Open standard for machine learning interoperability ★ 19,582 · Updated last week
- Natural Language Processing Tutorial for Deep Learning Researchers ★ 14,725 · Updated last year
- A PyTorch implementation of the Transformer model in "Attention is All You Need". ★ 9,368 · Updated last year
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… ★ 35,250 · Updated this week
- Streamlit – A faster way to build and share data apps. ★ 41,384 · Updated this week
- Scalable embedding, reasoning, ranking for images and sentences with CLIP ★ 12,743 · Updated last year
- This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows". ★ 15,185 · Updated last year
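Several of the listed libraries are commonly combined. As one hedged sketch, the text-embedding library above ("State-of-the-Art Text Embeddings", i.e. Sentence-Transformers) can feed FAISS ("efficient similarity search and clustering of dense vectors") for nearest-neighbour retrieval; the `all-MiniLM-L6-v2` checkpoint and the toy corpus below are illustrative choices only, assuming `sentence-transformers` and `faiss-cpu` are installed:

```python
# Sketch: encode sentences with Sentence-Transformers, then index and search
# them with FAISS. Model name and corpus are illustrative, not prescriptive.
import faiss
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")   # assumed example checkpoint
corpus = [
    "Transformers provides thousands of pretrained models.",
    "FAISS indexes dense vectors for fast similarity search.",
    "Streamlit turns Python scripts into shareable data apps.",
]

embeddings = model.encode(corpus, convert_to_numpy=True)  # (n_sentences, dim), float32
faiss.normalize_L2(embeddings)                            # so inner product == cosine
index = faiss.IndexFlatIP(int(embeddings.shape[1]))       # exact inner-product index
index.add(embeddings)

query = model.encode(["How can I search embeddings quickly?"], convert_to_numpy=True)
faiss.normalize_L2(query)
scores, ids = index.search(query, 2)                      # top-2 neighbours
for score, i in zip(scores[0], ids[0]):
    print(f"{score:.3f}  {corpus[i]}")
```

Normalising the vectors before indexing lets the inner-product index behave as cosine similarity, which is the usual convention for Sentence-Transformers embeddings.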