skolai / fewbit
Compression schema for gradients of activations in backward pass
☆44 · Updated 2 years ago
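The idea behind this kind of compression is that the backward pass of a pointwise nonlinearity only needs the activation's *derivative*, which can be quantized to a handful of bits instead of keeping the full-precision input tensor. The sketch below is a minimal NumPy illustration of that principle, not the fewbit library's actual API: all function names (`gelu_grad`, `compress_for_backward`, `backward`) and the uniform-grid quantizer are assumptions made for this example.

```python
import numpy as np

def gelu_grad(x):
    # Derivative of GELU via the common tanh approximation.
    c = np.sqrt(2.0 / np.pi)
    t = np.tanh(c * (x + 0.044715 * x**3))
    return 0.5 * (1.0 + t) + 0.5 * x * (1.0 - t**2) * c * (1.0 + 3 * 0.044715 * x**2)

def compress_for_backward(x, bits=3):
    # Forward pass: instead of saving x in full precision for backward,
    # quantize the activation derivative onto 2**bits uniform levels and
    # keep only the integer codes plus a tiny per-tensor codebook.
    levels = 2 ** bits
    g = gelu_grad(x)
    edges = np.linspace(g.min(), g.max(), levels + 1)
    codes = np.clip(np.digitize(g, edges) - 1, 0, levels - 1).astype(np.uint8)
    centers = 0.5 * (edges[:-1] + edges[1:])  # codebook: one value per level
    return codes, centers

def backward(grad_out, codes, centers):
    # Backward pass: reconstruct the derivative from the few-bit codes.
    return grad_out * centers[codes]
```

With `bits=3` each element costs 3 bits of payload (packed) instead of 32, at the price of a small, bounded error in the reconstructed gradient; in a real PyTorch integration this would live inside a custom `autograd.Function`.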
Alternatives and similar repositories for fewbit
Users interested in fewbit are comparing it to the libraries listed below:
- Learning to Initialize Neural Networks for Stable and Efficient Training ☆138 · Updated 3 years ago
- PyTorch implementation of L2L execution algorithm ☆109 · Updated 3 years ago
- ☆59 · Updated 5 years ago
- ☆29 · Updated 3 years ago
- ☆222 · Updated 2 years ago
- ☆124 · Updated last year
- sigma-MoE layer ☆21 · Updated 2 years ago
- Repository for DenseAttention and DANet, a fast and conceptually simple modification of standard attention and the Transformer ☆19 · Updated 3 weeks ago
- ☆160 · Updated 2 years ago
- Experiment of using Tangent to autodiff triton ☆81 · Updated 2 years ago
- Demo of the unit_scaling library, showing how a model can be easily adapted to train in FP8. ☆46 · Updated last year
- Customized matrix multiplication kernels ☆57 · Updated 3 years ago
- Easy-to-use AdaHessian optimizer (PyTorch) ☆79 · Updated 5 years ago
- A library for unit scaling in PyTorch ☆133 · Updated 6 months ago
- ☆71 · Updated last year
- ☆36 · Updated last year
- Official code for "Distributed Deep Learning in Open Collaborations" (NeurIPS 2021) ☆117 · Updated 4 years ago
- ☆20 · Updated last year
- Block-sparse primitives for PyTorch ☆160 · Updated 4 years ago
- Memory Efficient Attention (O(sqrt(n))) for Jax and PyTorch ☆184 · Updated 3 years ago
- Implementation of fused cosine similarity attention in the same style as Flash Attention ☆220 · Updated 2 years ago
- Latest Weight Averaging (NeurIPS HITY 2022) ☆32 · Updated 2 years ago
- MUSCO: MUlti-Stage COmpression of neural networks ☆72 · Updated 4 years ago
- ☆10 · Updated 3 years ago
- PyTorch implementation of HashedNets ☆38 · Updated 2 years ago
- Official implementation of the paper "You Do Not Fully Utilize Transformer's Representation Capacity" ☆31 · Updated 8 months ago
- A block-oriented training approach for inference-time optimization. ☆34 · Updated last year
- ☆21 · Updated 9 months ago
- ☆36 · Updated 2 years ago
- Implementation of a Transformer, but completely in Triton ☆278 · Updated 3 years ago