NielsRogge / Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
☆11,006 · Updated last month
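The demos in this repository are built on the Hugging Face Transformers library. As a rough illustration only, here is a minimal sketch using the library's high-level `pipeline` API; the default checkpoint it downloads is an assumption of this sketch, not something taken from the repository's notebooks, which generally load task-specific models explicitly.

```python
# Minimal sketch of basic Transformers usage (not code from this repository).
# pipeline("sentiment-analysis") downloads the library's default checkpoint;
# the tutorial notebooks typically pick specific models and processors instead.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Transformers-Tutorials makes fine-tuning approachable."))
# Expected shape of the output: [{'label': 'POSITIVE', 'score': ...}]
```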
Alternatives and similar repositories for Transformers-Tutorials
Users interested in Transformers-Tutorials are comparing it to the libraries listed below.
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ☆7,491 · Updated 3 weeks ago
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ☆18,861 · Updated this week
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ☆8,875 · Updated this week
- Fast and memory-efficient exact attention ☆18,043 · Updated this week
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ☆21,441 · Updated 3 weeks ago
- Notebooks using the Hugging Face libraries 🤗 ☆4,168 · Updated this week
- Train transformer language models with reinforcement learning. ☆14,366 · Updated this week
- State-of-the-Art Text Embeddings ☆16,984 · Updated 2 weeks ago
- ☆11,491 · Updated 3 months ago
- Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Py… ☆23,208 · Updated 3 months ago
- Cleanlab's open-source library is the standard data-centric AI package for data quality and machine learning with messy, real-world data … ☆10,645 · Updated 3 weeks ago
- A collection of tutorials on state-of-the-art computer vision models and techniques. Explore everything from foundational architectures l… ☆7,861 · Updated 3 weeks ago
- LAVIS - A One-stop Library for Language-Vision Intelligence ☆10,683 · Updated 7 months ago
- Ongoing research training transformer models at scale ☆12,641 · Updated last week
- An annotated implementation of the Transformer paper. ☆6,313 · Updated last year
- Accessible large language models via k-bit quantization for PyTorch. ☆7,167 · Updated this week
- A concise but complete full-attention transformer with a set of promising experimental features from various papers ☆5,405 · Updated this week
- Model interpretability and understanding for PyTorch ☆5,278 · Updated this week
- 🐍 Geometric Computer Vision Library for Spatial AI ☆10,561 · Updated this week
- Official codebase used to develop Vision Transformer, SigLIP, MLP-Mixer, LiT and more. ☆2,964 · Updated last month
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" ☆12,153 · Updated 6 months ago
- Jupyter notebooks for the Natural Language Processing with Transformers book ☆4,389 · Updated 10 months ago
- Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others) ☆9,001 · Updated this week
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… ☆34,559 · Updated this week
- PyTorch code for Vision Transformers training with the Self-Supervised learning method DINO ☆6,921 · Updated 11 months ago
- A Unified Library for Parameter-Efficient and Modular Transfer Learning ☆2,721 · Updated last month
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆22,142 · Updated 10 months ago
- Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes. ☆29,665 · Updated this week
- Code release for ConvNeXt model ☆6,043 · Updated 2 years ago
- Awesome-LLM: a curated list of Large Language Model ☆23,985 · Updated last month