NielsRogge / Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
★ 11,151 · Updated last month
Alternatives and similar repositories for Transformers-Tutorials
Users that are interested in Transformers-Tutorials are comparing it to the libraries listed below
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ★ 19,252 · Updated last week
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ★ 9,010 · Updated this week
- Materials for the Hugging Face Diffusion Models Course ★ 4,095 · Updated 6 months ago
- Notebooks using the Hugging Face libraries 🤗 ★ 4,235 · Updated last week
- A collection of resources and papers on Diffusion Models ★ 11,959 · Updated last year
- Jupyter notebooks for the Natural Language Processing with Transformers book ★ 4,468 · Updated 11 months ago
- Train transformer language models with reinforcement learning. ★ 14,989 · Updated this week
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ★ 21,614 · Updated last month
- A concise but complete full-attention transformer with a set of promising experimental features from various papers ★ 5,505 · Updated this week
- An annotated implementation of the Transformer paper. ★ 6,421 · Updated last year
- A Unified Library for Parameter-Efficient and Modular Transfer Learning ★ 2,750 · Updated this week
- Fast and memory-efficient exact attention ★ 18,776 · Updated last week
- Hackable and optimized Transformers building blocks, supporting a composable construction. ★ 9,821 · Updated last week
- LAVIS - A One-stop Library for Language-Vision Intelligence ★ 10,817 · Updated 8 months ago
- Accessible large language models via k-bit quantization for PyTorch. ★ 7,450 · Updated this week
- State-of-the-Art Text Embeddings ★ 17,299 · Updated last week
- A resource for learning about Machine learning & Deep Learning ★ 8,188 · Updated 11 months ago
- A playbook for systematically maximizing the performance of deep learning models. ★ 29,021 · Updated last year
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ★ 22,410 · Updated 11 months ago
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… ★ 35,004 · Updated this week
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ★ 7,586 · Updated 2 months ago
- Public repo for HF blog posts ★ 3,070 · Updated this week
- 🤗 Evaluate: A library for easily evaluating machine learning models and datasets. ★ 2,285 · Updated last month
- An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites ★ 4,913 · Updated last year
- LLM101n: Let's build a Storyteller ★ 34,179 · Updated last year
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" ★ 12,511 · Updated 7 months ago
- Foundation Architecture for (M)LLMs ★ 3,101 · Updated last year
- Latest Advances on Multimodal Large Language Models ★ 16,018 · Updated this week
- Official codebase used to develop Vision Transformer, SigLIP, MLP-Mixer, LiT and more. ★ 3,055 · Updated 2 months ago
- Cleanlab's open-source library is the standard data-centric AI package for data quality and machine learning with messy, real-world data … ★ 10,778 · Updated last month