meta-pytorch / superblock
A block-oriented training approach for inference-time optimization.
☆34 · Updated last year
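The core idea behind a block-oriented approach is to impose sparsity in contiguous tiles rather than on individual weights, so the surviving structure maps onto efficient block-sparse kernels at inference time. The sketch below is illustrative only and assumes nothing about superblock's actual API: `block_prune`, its magnitude-based tile scoring, and all parameter names are hypothetical stand-ins, shown purely to convey block-wise pruning in plain PyTorch.

```python
# Illustrative sketch only (not superblock's API): block-wise pruning.
# Magnitude-based tile scoring is a hypothetical stand-in for whatever
# scoring a real method would learn during training.
import torch

def block_prune(weight: torch.Tensor, block_size: int, keep_ratio: float) -> torch.Tensor:
    """Zero out all but the highest-scoring (block_size x block_size) tiles."""
    out_f, in_f = weight.shape
    assert out_f % block_size == 0 and in_f % block_size == 0
    # View the matrix as a grid of tiles and score each tile by mean |w|.
    tiles = weight.reshape(out_f // block_size, block_size,
                           in_f // block_size, block_size)
    scores = tiles.abs().mean(dim=(1, 3))            # one score per tile
    k = max(1, int(keep_ratio * scores.numel()))
    threshold = scores.flatten().topk(k).values.min()
    mask = (scores >= threshold).to(weight.dtype)    # keep the top-k tiles
    return (tiles * mask[:, None, :, None]).reshape(out_f, in_f)

w = torch.randn(256, 256)
w_pruned = block_prune(w, block_size=32, keep_ratio=0.25)
# Because the zeros are block-structured, the pruned weight converts
# cleanly to PyTorch's block-sparse (BSR) layout for inference:
w_bsr = w_pruned.to_sparse_bsr(blocksize=(32, 32))
```

Element-wise pruning at the same ratio would leave scattered zeros that dense kernels cannot exploit; tiling the sparsity is what lets it pay off at inference time.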
Alternatives and similar repositories for superblock
Users interested in superblock are comparing it to the libraries listed below:
- This repository contains the experimental PyTorch-native float8 training UX ☆223 · Updated last year
- ☆158 · Updated 2 years ago
- Quantize transformers to any learned, arbitrary 4-bit numeric format ☆48 · Updated 3 months ago
- Memory Optimizations for Deep Learning (ICML 2023) ☆108 · Updated last year
- A library for unit scaling in PyTorch ☆130 · Updated 3 months ago
- An experiment using Tangent to autodiff Triton ☆80 · Updated last year
- Official implementation of "Training LLMs with MXFP4" ☆96 · Updated 5 months ago
- Faster PyTorch bitsandbytes 4-bit FP4 nn.Linear ops ☆29 · Updated last year
- Flash-Muon: An Efficient Implementation of the Muon Optimizer ☆193 · Updated 3 months ago
- Work in progress. ☆74 · Updated 3 months ago
- QUICK: Quantization-aware Interleaving and Conflict-free Kernel for efficient LLM inference ☆118 · Updated last year
- Demo of the unit_scaling library, showing how a model can be easily adapted to train in FP8. ☆45 · Updated last year
- An extensible collectives library in Triton ☆89 · Updated 6 months ago
- ☆99 · Updated 4 months ago
- The evaluation framework for training-free sparse attention in LLMs ☆101 · Updated 3 months ago
- Patch convolution to avoid large GPU memory usage of Conv2D ☆92 · Updated 8 months ago
- Fast Hadamard transform in CUDA, with a PyTorch interface ☆246 · Updated this week
- ☆161 · Updated 2 years ago
- FlashRNN: Fast RNN Kernels with I/O Awareness ☆98 · Updated 4 months ago
- ☆113 · Updated last year
- ☆156 · Updated 2 years ago
- A bunch of kernels that might make stuff slower 😉 ☆59 · Updated this week
- ☆100 · Updated last month
- Tritonbench is a collection of PyTorch custom operators with example inputs to measure their performance. ☆252 · Updated this week
- Fast low-bit matmul kernels in Triton ☆379 · Updated 2 weeks ago
- ☆82 · Updated 8 months ago
- ☆122 · Updated last year
- ☆57 · Updated last year
- Ring-attention experiments ☆152 · Updated 11 months ago
- ☆251 · Updated 4 months ago