google-research / tuning_playbook
A playbook for systematically maximizing the performance of deep learning models.
☆27,503 · Updated 5 months ago
Alternatives and similar repositories for tuning_playbook:
Users interested in tuning_playbook are comparing it to the libraries listed below.
- Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others) ☆8,569 · Updated 3 weeks ago
- ☆10,605 · Updated last week
- 🧑‍🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, fee… ☆56,997 · Updated 3 months ago
- An annotated implementation of the Transformer paper. ☆5,788 · Updated 8 months ago
- A handbook that systematically teaches you how to maximize the performance of deep learning models. ☆2,625 · Updated last year
- This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows". ☆14,043 · Updated 4 months ago
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… ☆32,549 · Updated this week
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ☆16,658 · Updated this week
- Fast and memory-efficient exact attention ☆14,557 · Updated this week
- Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Py… ☆20,998 · Updated 2 weeks ago
- CLIP (Contrastive Language-Image Pretraining), Predict the most relevant text snippet given an image ☆26,369 · Updated 4 months ago
- An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites ☆4,673 · Updated 4 months ago
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" ☆10,899 · Updated 3 months ago
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ☆8,037 · Updated this week
- 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. ☆136,044 · Updated this week
- Explanation of key concepts in ML ☆7,348 · Updated this week
- A PyTorch implementation of the Transformer model in "Attention is All You Need". ☆8,924 · Updated 7 months ago
- A concise but complete full-attention transformer with a set of promising experimental features from various papers ☆4,874 · Updated this week
- Fast and Accurate ML in 3 Lines of Code ☆8,150 · Updated this week
- End-to-End Object Detection with Transformers ☆13,722 · Updated 8 months ago
- A hyperparameter optimization framework (see the sketch after this list) ☆11,044 · Updated this week
- A collection of resources and papers on Diffusion Models ☆11,210 · Updated 4 months ago
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆35,794 · Updated this week
- 🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and FLAX. ☆26,511 · Updated this week
- Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ☆15,630 · Updated last year
- LaTeX code for making neural network diagrams ☆22,335 · Updated last year
- LAVIS - A One-stop Library for Language-Vision Intelligence ☆10,037 · Updated 3 weeks ago
- Reading list for research topics in multimodal machine learning ☆6,131 · Updated 3 months ago
- Model interpretability and understanding for PyTorch ☆4,966 · Updated 2 weeks ago
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ☆20,350 · Updated last week
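The "hyperparameter optimization framework" entry above matches Optuna's description. Since the playbook itself is about systematic hyperparameter tuning, a minimal sketch of a search loop using Optuna's study/trial API is included below; the toy quadratic objective, the lr/weight_decay search space, and the trial count are illustrative assumptions rather than anything taken from the listed repositories. In practice the objective would train a model and return a validation metric.

```python
import optuna

def objective(trial):
    # Hypothetical search space: learning rate and weight decay on log scales.
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    weight_decay = trial.suggest_float("weight_decay", 1e-6, 1e-2, log=True)
    # Stand-in for "train a model with these hyperparameters and return a
    # validation metric"; a toy quadratic keeps the sketch self-contained.
    return (lr - 1e-3) ** 2 + (weight_decay - 1e-4) ** 2

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print(study.best_params)
```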