google-research / tuning_playbook
A playbook for systematically maximizing the performance of deep learning models.
☆28,843 · Updated last year
Alternatives and similar repositories for tuning_playbook
Users interested in tuning_playbook are comparing it to the repositories listed below.
- Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Py… ☆23,145 · Updated 3 months ago
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… ☆34,507 · Updated this week
- Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others) ☆8,991 · Updated last month
- 🎨 ML Visuals contains figures and templates which you can reuse and customize to improve your scientific writing. ☆15,281 · Updated 2 years ago
- Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes. ☆29,644 · Updated this week
- Advanced AI Explainability for computer vision. Support for CNNs, Vision Transformers, Classification, Object detection, Segmentation, I… ☆11,783 · Updated 2 months ago
- This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows". ☆14,911 · Updated 10 months ago
- 🍀 Pytorch implementation of various Attention Mechanisms, MLP, Re-parameter, Convolution, which is helpful to further understand papers.… ☆11,974 · Updated 6 months ago
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ☆18,820 · Updated this week
- A tactical handbook that systematically teaches you how to maximize the performance of deep learning models (Chinese translation of tuning_playbook). ☆2,893 · Updated 2 years ago
- Graph Neural Network Library for PyTorch ☆22,497 · Updated this week
- Fast and Accurate ML in 3 Lines of Code ☆8,953 · Updated this week
- A concise but complete full-attention transformer with a set of promising experimental features from various papers ☆5,396 · Updated this week
- A hyperparameter optimization framework ☆12,136 · Updated this week
- Explanation to key concepts in ML ☆7,617 · Updated this week
- Fast and memory-efficient exact attention ☆17,952 · Updated this week
- Fast and flexible image augmentation library. Paper about the library: https://www.mdpi.com/2078-2489/11/2/125 ☆14,995 · Updated this week
- Mamba SSM architecture ☆15,095 · Updated 3 weeks ago
- End-to-End Object Detection with Transformers ☆14,430 · Updated last year
- A Collection of Variational Autoencoders (VAE) in PyTorch. ☆7,201 · Updated 3 months ago
- A collection of resources and papers on Diffusion Models ☆11,812 · Updated 10 months ago
- 🐍 Geometric Computer Vision Library for Spatial AI ☆10,537 · Updated last week
- PyTorch code and models for the DINOv2 self-supervised learning method. ☆10,845 · Updated last week
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆22,108 · Updated 10 months ago
- An annotated implementation of the Transformer paper. ☆6,296 · Updated last year
- [NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond. ☆22,843 · Updated 10 months ago
- Hackable and optimized Transformers building blocks, supporting a composable construction. ☆9,610 · Updated this week
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" ☆12,090 · Updated 6 months ago
- Python package built to ease deep learning on graph, on top of existing DL frameworks. ☆13,921 · Updated 4 months ago