google-research / tuning_playbook
A playbook for systematically maximizing the performance of deep learning models.
☆29,355 · Updated last year
Alternatives and similar repositories for tuning_playbook
Users interested in tuning_playbook are comparing it to the libraries listed below.
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ☆21,817 · Updated 4 months ago
- A tactical handbook that systematically teaches you how to maximize the performance of deep learning models. ☆3,076 · Updated 2 years ago
- 🎨 ML Visuals contains figures and templates which you can reuse and customize to improve your scientific writing. ☆15,999 · Updated 2 years ago
- Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes. ☆30,382 · Updated this week
- Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others) ☆9,264 · Updated 3 months ago
- 🧑🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, fee… ☆64,237 · Updated this week
- An annotated implementation of the Transformer paper. ☆6,691 · Updated last year
- A concise but complete full-attention transformer with a set of promising experimental features from various papers ☆5,676 · Updated this week
- Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Py… ☆24,374 · Updated 2 weeks ago
- The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights --… ☆35,688 · Updated last week
- Mamba SSM architecture ☆16,379 · Updated last month
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ☆20,002 · Updated last week
- ☆11,989 · Updated 8 months ago
- The repository provides code for running inference with the SegmentAnything Model (SAM), links for downloading the trained model checkpoi… ☆52,429 · Updated last year
- An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites ☆4,960 · Updated last year
- PyTorch code and models for the DINOv2 self-supervised learning method. ☆11,847 · Updated 2 months ago
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ☆7,745 · Updated 5 months ago
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆22,890 · Updated last year
- View model summaries in PyTorch! ☆2,878 · Updated last week
- Kolmogorov Arnold Networks ☆15,975 · Updated 9 months ago
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" ☆12,915 · Updated 10 months ago
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ☆9,277 · Updated last week
- A hyperparameter optimization framework ☆13,005 · Updated this week
- Transformer: PyTorch Implementation of "Attention Is All You Need" ☆4,211 · Updated 3 months ago
- The AI developer platform. Use Weights & Biases to train and fine-tune models, and manage models from experimentation to production. ☆10,534 · Updated this week
- Train transformer language models with reinforcement learning. ☆16,228 · Updated this week
- Explanation of key concepts in ML ☆8,114 · Updated 4 months ago
- OpenMMLab Detection Toolbox and Benchmark ☆31,974 · Updated last year
- 🐍 Geometric Computer Vision Library for Spatial AI ☆10,834 · Updated this week
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆40,670 · Updated this week