ActNN: Reducing Training Memory Footprint via 2-Bit Activation Compressed Training
☆198 · Dec 22, 2022 · Updated 3 years ago
Alternatives and similar repositories for actnn
Users that are interested in actnn are comparing it to the libraries listed below.
- ☆44 · Nov 1, 2022 · Updated 3 years ago
- Code for the paper "A Statistical Framework for Low-bitwidth Training of Deep Neural Networks" ☆29 · Oct 31, 2020 · Updated 5 years ago
- Improving Post Training Neural Quantization: Layer-wise Calibration and Integer Programming ☆98 · Jun 10, 2021 · Updated 4 years ago
- ☆43 · Jan 30, 2024 · Updated 2 years ago
- MONeT framework for reducing memory consumption of DNN training ☆174 · May 4, 2021 · Updated 4 years ago
- ☆58 · Dec 8, 2020 · Updated 5 years ago
- ☆42 · Sep 8, 2023 · Updated 2 years ago
- Source code of the paper "Robust Quantization: One Model to Rule Them All" ☆40 · Mar 24, 2023 · Updated 3 years ago
- Code for the NeurIPS 2019 paper "MetaQuant: Learning to Quantize by Learning to Penetrate Non-differentiable Quantization" ☆54 · May 8, 2020 · Updated 5 years ago
- [ICML 2023] Official implementation of the accepted ICML 2023 paper "BiBench: Benchmarking and Analyzing Network Binar…" ☆56 · Mar 4, 2024 · Updated 2 years ago
- An external memory allocator example for PyTorch.