google-research / tuning_playbook
A playbook for systematically maximizing the performance of deep learning models.
☆28,899 · Updated last year
Alternatives and similar repositories for tuning_playbook
Users interested in tuning_playbook are comparing it to the libraries listed below.
- A concise but complete full-attention transformer with a set of promising experimental features from various papers ☆5,441 · Updated last week
- A tactical handbook that systematically teaches you to maximize the performance of deep learning models (originally in Chinese). ☆2,921 · Updated 2 years ago
- Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others) ☆9,019 · Updated last week
- 🎨 ML Visuals contains figures and templates which you can reuse and customize to improve your scientific writing. ☆15,414 · Updated 2 years ago
- 🧑🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, fee… ☆61,778 · Updated 10 months ago
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… ☆34,654 · Updated this week
- Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes. ☆29,767 · Updated this week
- Explanation to key concepts in ML ☆7,974 · Updated last week
- A collection of resources and papers on Diffusion Models ☆11,882 · Updated 11 months ago
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆42,646 · Updated 7 months ago
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆22,239 · Updated 10 months ago
- ☆11,545 · Updated 4 months ago
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" ☆12,224 · Updated 6 months ago
- Train transformer language models with reinforcement learning. ☆14,513 · Updated this week
- The official GitHub page for the survey paper "A Survey of Large Language Models". ☆11,648 · Updated 4 months ago
- Fast and memory-efficient exact attention ☆18,252 · Updated this week
- An annotated implementation of the Transformer paper. ☆6,338 · Updated last year
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ☆18,976 · Updated this week
- A curated list of practical guide resources of LLMs (LLMs Tree, Examples, Papers) ☆9,975 · Updated last year
- The AI developer platform. Use Weights & Biases to train and fine-tune models, and manage models from experimentation to production. ☆10,082 · Updated this week
- Graph Neural Network Library for PyTorch ☆22,563 · Updated last week
- This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows". ☆14,979 · Updated 11 months ago
- Understanding Deep Learning - Simon J.D. Prince ☆7,631 · Updated this week
- This repository contains demos I made with the Transformers library by HuggingFace. ☆11,055 · Updated last week
- Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Py… ☆23,314 · Updated 4 months ago
- 🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal model… ☆146,733 · Updated this week
- Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including … ☆26,239 · Updated 10 months ago
- The repository provides code for running inference with the SegmentAnything Model (SAM), links for downloading the trained model checkpoi… ☆50,759 · Updated 9 months ago
- LAVIS - A One-stop Library for Language-Vision Intelligence ☆10,719 · Updated 7 months ago
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆39,299 · Updated this week