NielsRogge / Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
★ 11,490 · Updated 3 weeks ago
Alternatives and similar repositories for Transformers-Tutorials
Users interested in Transformers-Tutorials are comparing it to the libraries listed below.
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i…) (★ 9,486 · updated this week; usage sketch below)
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. (★ 20,587 · updated this week; usage sketch below)
- BertViz: Visualize Attention in Transformer Models (★ 7,908 · updated last month)
- Train transformer language models with reinforcement learning. (★ 17,297 · updated this week)
- Fast and memory-efficient exact attention (★ 22,113 · updated this week)
- An open source implementation of CLIP. (★ 13,324 · updated 3 months ago)
- Accessible large language models via k-bit quantization for PyTorch. (★ 7,939 · updated 2 weeks ago)
- LAVIS - A One-stop Library for Language-Vision Intelligence (★ 11,154 · updated last year)
- (no description) (★ 12,275 · updated last week)
- Hackable and optimized Transformers building blocks, supporting a composable construction. (★ 10,326 · updated this week)
- An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites (★ 5,008 · updated last year)
- Jupyter notebooks for the Natural Language Processing with Transformers book (★ 4,706 · updated last year)
- Official codebase used to develop Vision Transformer, SigLIP, MLP-Mixer, LiT and more. (★ 3,343 · updated 8 months ago)
- PyTorch code and models for the DINOv2 self-supervised learning method. (★ 12,370 · updated last month)
- Materials for the Hugging Face Diffusion Models Course (★ 4,267 · updated 3 weeks ago)
- A concise but complete full-attention transformer with a set of promising experimental features from various papers (★ 5,795 · updated last month)
- Notebooks using the Hugging Face libraries 🤗 (★ 4,451 · updated this week)
- A Unified Library for Parameter-Efficient and Modular Transfer Learning (★ 2,802 · updated 3 months ago)
- Explanation to key concepts in ML (★ 8,265 · updated 7 months ago)
- 🤗 Evaluate: A library for easily evaluating machine learning models and datasets. (★ 2,410 · updated 2 weeks ago; usage sketch below)
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" (★ 13,219 · updated last year)
- Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes. (★ 30,803 · updated this week)
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities (★ 22,002 · updated 2 weeks ago)
- A framework for few-shot evaluation of language models. (★ 11,358 · updated this week)
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training (★ 23,451 · updated last year)
- State-of-the-Art Text Embeddings (★ 18,192 · updated last week; usage sketch below)
- Ongoing research training transformer models at scale (★ 15,100 · updated this week)
- PyTorch native post-training library (★ 5,660 · updated this week)
- PyTorch code for BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation (★ 5,658 · updated last year)
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… (★ 36,313 · updated last week; usage sketch below)
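
Usage sketches for a few of the libraries above

The first entry (🚀) is Hugging Face Accelerate, which launches and trains PyTorch models on almost any device and distributed configuration with automatic mixed precision. A minimal sketch of a training loop wrapped with Accelerate, assuming a toy model and dataset in place of your own objects:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from accelerate import Accelerator

accelerator = Accelerator()  # picks the device / mixed-precision setup for you

# toy stand-ins for a real model, optimizer, and dataloader
model = nn.Linear(10, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
dataset = TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,)))
dataloader = DataLoader(dataset, batch_size=8)

# prepare() moves everything to the right device(s) and wraps them for DDP/AMP
model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)

loss_fn = nn.CrossEntropyLoss()
model.train()
for inputs, labels in dataloader:
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), labels)
    accelerator.backward(loss)  # replaces the usual loss.backward()
    optimizer.step()
```

The same script can then be launched across GPUs or machines with `accelerate launch` without code changes.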
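
The 🤗 PEFT entry covers parameter-efficient fine-tuning. A minimal sketch of attaching LoRA adapters to a Transformers model; the bert-base-uncased checkpoint and the target_modules names are illustrative choices for a BERT-style encoder, not requirements:

```python
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, TaskType, get_peft_model

# illustrative base model; swap in whatever checkpoint you are fine-tuning
base_model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,          # sequence classification task
    r=8,                                 # rank of the low-rank update matrices
    lora_alpha=16,
    lora_dropout=0.1,
    target_modules=["query", "value"],   # attention projections in BERT-style models
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the LoRA weights (plus the head) are trainable
# the wrapped model can now be trained with the usual Trainer or a custom loop
```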
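
The 🤗 Evaluate entry is about computing metrics over models and datasets. A minimal sketch using the built-in accuracy metric:

```python
import evaluate

# load a metric by name and compute it over predictions vs. references
accuracy = evaluate.load("accuracy")
result = accuracy.compute(predictions=[0, 1, 1, 0], references=[0, 1, 0, 0])
print(result)  # {'accuracy': 0.75}
```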
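
The "State-of-the-Art Text Embeddings" entry is Sentence Transformers, which maps sentences to dense vectors for semantic similarity and search. A minimal sketch, assuming the commonly used all-MiniLM-L6-v2 checkpoint:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "Transformers-Tutorials contains demo notebooks for the Transformers library.",
    "This repository has example notebooks for HuggingFace Transformers.",
    "The weather is nice today.",
]
embeddings = model.encode(sentences, convert_to_tensor=True)

# cosine similarity between the first sentence and the other two
scores = util.cos_sim(embeddings[0], embeddings[1:])
print(scores)  # the paraphrase should score much higher than the unrelated sentence
```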
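
The last entry (the large collection of PyTorch image encoders / backbones) is timm. A minimal sketch of listing available models and running a pretrained Vision Transformer; the model name is just one of many:

```python
import timm
import torch

# list a few ViT variants that timm knows about
print(timm.list_models("vit_*")[:5])

# create a pretrained model (downloads weights on first use) and run a forward pass
model = timm.create_model("vit_base_patch16_224", pretrained=True)
model.eval()

x = torch.randn(1, 3, 224, 224)  # stand-in for a preprocessed image batch
with torch.no_grad():
    logits = model(x)
print(logits.shape)  # torch.Size([1, 1000]) for an ImageNet-1k classification head
```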