neuralmagic / sparseml
Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models
☆2,124 · Updated 8 months ago
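The "few lines of code" claim in the description refers to SparseML's recipe-driven integration. Below is a minimal, hedged sketch of that workflow using the PyTorch `ScheduledModifierManager` API; the recipe file, model, optimizer, and step count are placeholders for illustration and are not part of this listing.

```python
# Minimal sketch of SparseML's recipe-driven sparsification (PyTorch path).
# Assumptions: a recipe YAML exists at "recipe.yaml"; the model, optimizer,
# and steps_per_epoch below stand in for a real training setup.
import torch
from sparseml.pytorch.optim import ScheduledModifierManager

model = torch.nn.Sequential(
    torch.nn.Linear(128, 64), torch.nn.ReLU(), torch.nn.Linear(64, 10)
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
steps_per_epoch = 100  # placeholder for len(train_loader)

# Load the recipe and wrap the optimizer so pruning/quantization modifiers
# fire at their scheduled steps during the normal training loop.
manager = ScheduledModifierManager.from_yaml("recipe.yaml")
optimizer = manager.modify(model, optimizer, steps_per_epoch=steps_per_epoch)

# ... run the usual training loop here; the recipe controls sparsification ...

manager.finalize(model)  # remove training-time hooks once training is done
```

The recipe encodes the sparsification schedule (for example, gradual magnitude pruning), so the training code itself stays unchanged.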
Alternatives and similar repositories for sparseml:
Users interested in sparseml are comparing it to the libraries listed below.
- Sparsity-aware deep learning inference runtime for CPUs ☆3,133 · Updated 9 months ago
- Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes ☆382 · Updated 9 months ago
- Top-level directory for documentation and general content ☆120 · Updated 4 months ago
- ML model optimization product to accelerate inference. ☆326 · Updated last year
- A Python-level JIT compiler designed to make unmodified PyTorch programs faster. ☆1,040 · Updated last year
- Efficient, scalable and enterprise-grade CPU/GPU inference server for 🤗 Hugging Face transformer models 🚀 ☆1,684 · Updated 5 months ago
- Kernl lets you run PyTorch transformer models several times faster on GPU with a single line of code, and is designed to be easily hackab… ☆1,565 · Updated last year
- FFCV: Fast Forward Computer Vision (and other ML workloads!) ☆2,917 · Updated 10 months ago
- Library for 8-bit optimizers and quantization routines. ☆716 · Updated 2 years ago
- PyTorch/TorchScript/FX compiler for NVIDIA GPUs using TensorRT ☆2,725 · Updated this week
- Your PyTorch AI Factory - Flash enables you to easily configure and run complex AI recipes for over 15 tasks across 7 data domains ☆1,741 · Updated last year
- The WeightWatcher tool for predicting the accuracy of Deep Neural Networks ☆1,572 · Updated 7 months ago
- Cramming the training of a (BERT-type) language model into limited compute. ☆1,329 · Updated 10 months ago
- Hummingbird compiles trained ML models into tensor computation for faster inference. ☆3,432 · Updated 3 months ago
- PyTriton is a Flask/FastAPI-like interface that simplifies Triton's deployment in Python environments. ☆786 · Updated 2 months ago
- MADGRAD Optimization Method ☆803 · Updated 2 months ago
- PyTorch extensions for high performance and large scale training. ☆3,298 · Updated last week
- functorch is JAX-like composable function transforms for PyTorch. ☆1,422 · Updated this week
- Neural Network Compression Framework for enhanced OpenVINO™ inference ☆998 · Updated this week
- SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX R… ☆2,376 · Updated this week
- maximal update parametrization (µP) ☆1,497 · Updated 9 months ago
- Thunder gives you PyTorch models superpowers for training and inference. Unlock out-of-the-box optimizations for performance, memory and … ☆1,325 · Updated this week
- Accelerate your Neural Architecture Search (NAS) through fast, reproducible and modular research. ☆475 · Updated 5 months ago
- Tensors, for human consumption ☆1,214 · Updated 5 months ago
- The merlin dataloader lets you rapidly load tabular data for training deep learning models with TensorFlow, PyTorch or JAX ☆418 · Updated last year
- Toolbox of models, callbacks, and datasets for AI/ML researchers. ☆1,719 · Updated 2 weeks ago
- torch-optimizer -- collection of optimizers for PyTorch ☆3,103 · Updated last year
- TorchBench is a collection of open source benchmarks used to evaluate PyTorch performance. ☆933 · Updated this week
- A PyTorch repo for data loading and utilities to be shared by the PyTorch domain libraries. ☆1,187 · Updated this week
- PyTorch native quantization and sparsity for training and inference ☆1,974 · Updated this week
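The final entry above describes PyTorch-native quantization and sparsity. Assuming it refers to the torchao project (pytorch/ao), post-training weight-only int8 quantization looks roughly like the sketch below; the model is a placeholder, and the `quantize_` / `int8_weight_only` calls reflect recent torchao releases and are stated here as assumptions rather than details taken from this listing.

```python
# Hedged sketch: post-training int8 weight-only quantization, assuming the
# last list entry refers to torchao (pytorch/ao) and that its
# quantize_ / int8_weight_only API from recent releases is available.
import torch
from torchao.quantization import quantize_, int8_weight_only

model = torch.nn.Sequential(torch.nn.Linear(1024, 1024), torch.nn.ReLU()).eval()
quantize_(model, int8_weight_only())  # swaps Linear weights to int8 in place

with torch.inference_mode():
    out = model(torch.randn(8, 1024))  # placeholder input
```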