NielsRogge / Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
☆11,342 · Updated 4 months ago
Alternatives and similar repositories for Transformers-Tutorials
Users interested in Transformers-Tutorials are comparing it to the libraries listed below.
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, with automatic mixed precision and more (see the training-loop sketch below this list) ☆9,289 · Updated this week
- A playbook for systematically maximizing the performance of deep learning models ☆29,395 · Updated last year
- Notebooks using the Hugging Face libraries 🤗 ☆4,374 · Updated this week
- An annotated implementation of the Transformer paper ☆6,691 · Updated last year
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" ☆12,915 · Updated 11 months ago
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ☆7,745 · Updated 5 months ago
- Fast and memory-efficient exact attention ☆20,541 · Updated this week
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning (see the LoRA sketch below this list) ☆20,050 · Updated this week
- Flexible and powerful tensor operations for readable and reliable code, for PyTorch, JAX, TF and others (see the einops sketch below this list) ☆9,269 · Updated 3 months ago
- Train transformer language models with reinforcement learning ☆16,308 · Updated this week
- A concise but complete full-attention transformer with a set of promising experimental features from various papers ☆5,676 · Updated last week
- An open source implementation of CLIP ☆12,916 · Updated last week
- The largest collection of PyTorch image encoders / backbones, including train, eval, inference, and export scripts, and pretrained weights ☆35,773 · Updated last week
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ☆21,831 · Updated 4 months ago
- LAVIS - A One-stop Library for Language-Vision Intelligence ☆11,020 · Updated 11 months ago
- A comprehensive paper list on Vision Transformers and attention, including papers, code, and related websites ☆4,967 · Updated last year
- Accessible large language models via k-bit quantization for PyTorch ☆7,737 · Updated last week
- PyTorch code and models for the DINOv2 self-supervised learning method ☆11,847 · Updated 3 months ago
- A collection of resources and papers on Diffusion Models ☆12,134 · Updated last year
- Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch ☆24,435 · Updated last week
- Materials for the Hugging Face Diffusion Models Course ☆4,186 · Updated 9 months ago
- A minimal PyTorch re-implementation of OpenAI GPT (Generative Pretrained Transformer) training ☆22,917 · Updated last year
- The simplest, fastest repository for training/finetuning medium-sized GPTs ☆49,543 · Updated this week
- Jupyter notebooks for the Natural Language Processing with Transformers book ☆4,628 · Updated last year
- CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image ☆31,564 · Updated last year
- Transformer: PyTorch Implementation of "Attention Is All You Need" ☆4,240 · Updated 4 months ago
- Explanations of key concepts in ML ☆8,114 · Updated 4 months ago
- [NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA), built towards GPT-4V-level capabilities and beyond ☆23,957 · Updated last year
- Easily turn large sets of image URLs into an image dataset; can download, resize, and package 100M URLs in 20h on one machine ☆4,211 · Updated 3 weeks ago
- State-of-the-Art Text Embeddings (see the embedding sketch below this list) ☆17,849 · Updated 3 weeks ago
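For the Accelerate entry above, a minimal sketch of the usual workflow. The toy linear model and random dataset are placeholders added only for illustration; the `Accelerator`, `prepare`, and `backward` calls are the library's real entry points.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from accelerate import Accelerator

accelerator = Accelerator()  # picks up device, distributed setup, and mixed-precision config

# Toy model, optimizer, and data standing in for a real training setup
model = torch.nn.Linear(16, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
dataset = TensorDataset(torch.randn(64, 16), torch.randint(0, 2, (64,)))
dataloader = DataLoader(dataset, batch_size=8)

# prepare() wraps everything for the current device / distributed configuration
model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)

for inputs, labels in dataloader:
    optimizer.zero_grad()
    loss = torch.nn.functional.cross_entropy(model(inputs), labels)
    accelerator.backward(loss)  # replaces loss.backward()
    optimizer.step()
```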
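For the PEFT entry above, a minimal LoRA sketch. The `gpt2` checkpoint and `target_modules=["c_attn"]` are illustrative choices for a small GPT-2 model, not a recommendation; the right target modules differ per architecture.

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Small base checkpoint chosen only to keep the example lightweight
base = AutoModelForCausalLM.from_pretrained("gpt2")

lora_config = LoraConfig(
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor applied to the LoRA updates
    target_modules=["c_attn"],  # GPT-2's fused attention projection (architecture-specific)
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the LoRA adapter weights are trainable
```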
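For the einops entry above, a small sketch of how a reshape and a pooling operation read when written as einops expressions; the tensor shapes are arbitrary examples.

```python
import torch
from einops import rearrange, reduce

images = torch.randn(8, 3, 224, 224)  # (batch, channels, height, width)

# Flatten the spatial dimensions into a token axis, channels last: (8, 224*224, 3)
tokens = rearrange(images, "b c h w -> b (h w) c")

# Global average pooling over height and width: (8, 3)
pooled = reduce(images, "b c h w -> b c", "mean")
```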
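For the Sentence Transformers entry above, a minimal embedding-and-similarity sketch; `all-MiniLM-L6-v2` is one commonly used checkpoint, chosen here only as an example, and the sentences are made up.

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "How do I fine-tune a Transformer?",
    "Tutorial on fine-tuning BERT",
    "Recipe for banana bread",
]
embeddings = model.encode(sentences)  # one vector per sentence

# Cosine similarity of the first sentence against the other two
scores = util.cos_sim(embeddings[0], embeddings[1:])
print(scores)
```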