NielsRogge / Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
★ 11,255 · Updated 3 months ago
Alternatives and similar repositories for Transformers-Tutorials
Users that are interested in Transformers-Tutorials are comparing it to the libraries listed below
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ★ 19,743 · Updated this week
- Jupyter notebooks for the Natural Language Processing with Transformers book ★ 4,553 · Updated last year
- Train transformer language models with reinforcement learning. ★ 15,739 · Updated this week
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ★ 9,180 · Updated last week
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ★ 7,686 · Updated 4 months ago
- LAVIS - A One-stop Library for Language-Vision Intelligence ★ 10,936 · Updated 10 months ago
- Notebooks using the Hugging Face libraries 🤗 ★ 4,329 · Updated last week
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ★ 21,753 · Updated 3 months ago
- An open source implementation of CLIP. ★ 12,702 · Updated 2 weeks ago
- An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites ★ 4,932 · Updated last year
- Fast and memory-efficient exact attention ★ 19,778 · Updated this week
- Official codebase used to develop Vision Transformer, SigLIP, MLP-Mixer, LiT and more. ★ 3,161 · Updated 4 months ago
- A Unified Library for Parameter-Efficient and Modular Transfer Learning ★ 2,772 · Updated last month
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" ★ 12,775 · Updated 9 months ago
- ★ 11,828 · Updated 7 months ago
- Accessible large language models via k-bit quantization for PyTorch. ★ 7,627 · Updated this week
- Ongoing research training transformer models at scale ★ 13,728 · Updated last week
- Cleanlab's open-source library is the standard data-centric AI package for data quality and machine learning with messy, real-world data… ★ 10,937 · Updated last month
- Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Py… ★ 24,080 · Updated last week
- A framework for few-shot evaluation of language models. ★ 10,270 · Updated this week
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ★ 22,647 · Updated last year
- Hackable and optimized Transformers building blocks, supporting a composable construction. ★ 9,978 · Updated last week
- Public repo for HF blog posts ★ 3,141 · Updated this week
- A collection of tutorials on state-of-the-art computer vision models and techniques. Explore everything from foundational architectures l… ★ 8,517 · Updated last week
- State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enter… ★ 14,512 · Updated last year
- Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others) ★ 9,205 · Updated last month
- 20+ high-performance LLMs with recipes to pretrain, finetune and deploy at scale. ★ 12,817 · Updated this week
- Scenic: A Jax Library for Computer Vision Research and Beyond ★ 3,679 · Updated this week
- QLoRA: Efficient Finetuning of Quantized LLMs ★ 10,679 · Updated last year
- 🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal model… ★ 150,669 · Updated this week