google-research / tuning_playbook
A playbook for systematically maximizing the performance of deep learning models.
☆28,656 · Updated 10 months ago
Alternatives and similar repositories for tuning_playbook:
Users interested in tuning_playbook are comparing it to the repositories listed below.
- A tactical playbook that systematically teaches you how to maximize the performance of deep learning models. ☆2,850 · Updated last year
- 🧑‍🏫 60+ implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, fee… ☆60,456 · Updated 8 months ago
- This repository contains demos I made with the Transformers library by HuggingFace. ☆10,821 · Updated 2 weeks ago
- Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Py… ☆22,755 · Updated 2 months ago
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights… ☆34,049 · Updated this week
- A collection of resources and papers on diffusion models. ☆11,696 · Updated 9 months ago
- Pretrain and finetune ANY AI model of ANY size on multiple GPUs and TPUs with zero code changes. ☆29,414 · Updated this week
- Fast and memory-efficient exact attention. ☆17,259 · Updated last week
- Hackable and optimized Transformers building blocks, supporting a composable construction. ☆9,457 · Updated last week
- Large-scale self-supervised pre-training across tasks, languages, and modalities. ☆21,193 · Updated 2 months ago
- Flexible and powerful tensor operations for readable and reliable code (for PyTorch, JAX, TF, and others). ☆8,892 · Updated last week
- Implementation of Denoising Diffusion Probabilistic Model in PyTorch. ☆9,297 · Updated 6 months ago
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆38,206 · Updated this week
- PyTorch code and models for the DINOv2 self-supervised learning method. ☆10,510 · Updated 9 months ago
- A concise but complete full-attention transformer with a set of promising experimental features from various papers. ☆5,288 · Updated 2 weeks ago
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ☆8,673 · Updated this week
- 🎨 ML Visuals contains figures and templates which you can reuse and customize to improve your scientific writing. ☆14,923 · Updated 2 years ago
- 🤗 Transformers: State-of-the-art machine learning for PyTorch, TensorFlow, and JAX. ☆143,804 · Updated this week
- RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be directly trained like a GPT transformer (parallelizable)… ☆13,576 · Updated this week
- Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries, including… ☆25,741 · Updated 8 months ago
- 🤗 Diffusers: State-of-the-art diffusion models for image, video, and audio generation in PyTorch and FLAX. ☆28,835 · Updated this week
- An annotated implementation of the Transformer paper. ☆6,199 · Updated last year
- This is an official implementation of "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows". ☆14,729 · Updated 9 months ago
- 🤗 PEFT: State-of-the-art parameter-efficient fine-tuning. ☆18,320 · Updated this week
- CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image. ☆28,780 · Updated 9 months ago
- This repository provides code for running inference with the Segment Anything Model (SAM), links for downloading the trained model checkpoi… ☆49,997 · Updated 7 months ago
- The official GitHub page for the survey paper "A Survey of Large Language Models". ☆11,446 · Updated last month
- Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more. ☆32,148 · Updated this week
- LAVIS: a one-stop library for language-vision intelligence. ☆10,528 · Updated 5 months ago