google-research / tuning_playbook
A playbook for systematically maximizing the performance of deep learning models.
☆29,203 · Updated last year
Alternatives and similar repositories for tuning_playbook
Users interested in tuning_playbook are comparing it to the libraries listed below.
- A tactical playbook that systematically teaches you how to maximize the performance of deep learning models (Chinese translation). ☆3,016 · Updated 2 years ago
- 🎨 ML Visuals contains figures and templates which you can reuse and customize to improve your scientific writing. ☆15,806 · Updated 2 years ago
- Fast and memory-efficient exact attention ☆19,693 · Updated last week
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ☆19,688 · Updated last week
- 🧑‍🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, fee… ☆63,383 · Updated 2 weeks ago
- Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Py… ☆24,080 · Updated last week
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ☆9,165 · Updated last week
- This repository contains demos I made with the Transformers library by HuggingFace. ☆11,255 · Updated 3 months ago
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… ☆35,398 · Updated this week
- Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others) ☆9,205 · Updated last month
- A concise but complete full-attention transformer with a set of promising experimental features from various papers ☆5,597 · Updated this week
- ☆11,811 · Updated 6 months ago
- Code examples in PyTorch and TensorFlow for CS230 ☆4,107 · Updated 2 years ago
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ☆21,753 · Updated 3 months ago
- Fast and flexible image augmentation library. Paper about the library: https://www.mdpi.com/2078-2489/11/2/125 ☆15,161 · Updated 3 months ago
- This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows". ☆15,249 · Updated last year
- An annotated implementation of the Transformer paper. ☆6,579 · Updated last year
- Advanced AI Explainability for computer vision. Support for CNNs, Vision Transformers, Classification, Object detection, Segmentation, I… ☆12,209 · Updated 5 months ago
- Mamba SSM architecture ☆15,956 · Updated this week
- A collection of resources and papers on Diffusion Models ☆12,053 · Updated last year
- Must-read papers on graph neural networks (GNN) ☆16,588 · Updated last year
- An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites ☆4,930 · Updated last year
- PyTorch Tutorial for Deep Learning Researchers ☆31,769 · Updated 2 years ago
- Explanation of key concepts in ML ☆8,092 · Updated 3 months ago
- Model interpretability and understanding for PyTorch ☆5,419 · Updated this week
- A PyTorch implementation of the Transformer model in "Attention is All You Need". ☆9,402 · Updated last year
- PyTorch code and models for the DINOv2 self-supervised learning method. ☆11,638 · Updated last month
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" ☆12,742 · Updated 9 months ago
- Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes. ☆30,194 · Updated this week
- 🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal model… ☆150,386 · Updated this week