PrunaAI / pruna
Pruna is a model optimization framework built for developers, enabling you to deliver faster, more efficient models with minimal overhead.
⭐ 1,045 · Updated last week
Alternatives and similar repositories for pruna
Users interested in pruna are comparing it to the libraries listed below.
- A lightweight, local-first experiment tracking library from Hugging Face 🤗 ⭐ 1,140 · Updated this week
- A curated list of materials on AI efficiency ⭐ 195 · Updated last month
- Speed up model training by fixing data loading. ⭐ 565 · Updated this week
- An interface library for RL post-training with environments. ⭐ 829 · Updated this week
- dLLM: Simple Diffusion Language Modeling ⭐ 1,261 · Updated last week
- Where GPUs get cooked 👩‍🍳🔥 ⭐ 326 · Updated 2 months ago
- ⭐ 532 · Updated 4 months ago
- A Lossless Compression Library for AI pipelines ⭐ 289 · Updated 5 months ago
- PyTorch compiler that accelerates training and inference. Get built-in optimizations for performance, memory, parallelism, and easily wri… ⭐ 1,429 · Updated this week
- Tool for generating high-quality Synthetic datasets ⭐ 1,420 · Updated last month
- Best practices & guides on how to write distributed PyTorch training code ⭐ 552 · Updated last month
- Fast State-of-the-Art Static Embeddings ⭐ 1,948 · Updated last month
- ⭐ 213 · Updated this week
- Next Generation Experimental Tracking for Machine Learning Operations ⭐ 359 · Updated 6 months ago
- 🤗 Benchmark Large Language Models Reliably On Your Data ⭐ 418 · Updated this week
- Courses on building, compressing, evaluating, and deploying efficient AI models. ⭐ 62 · Updated last month
- ⏰ AI conference deadline countdowns ⭐ 291 · Updated last week
- An implementation of the PSGD Kron second-order optimizer for PyTorch ⭐ 97 · Updated 4 months ago
- TabBench is a benchmark built to evaluate machine learning models on tabular data, focusing on real-world industry use cases. ⭐ 107 · Updated 2 months ago
- Simple UI for debugging correlations of text embeddings ⭐ 302 · Updated 6 months ago
- Dion optimizer algorithm ⭐ 403 · Updated last week
- Official repository for our work on micro-budget training of large-scale diffusion models. ⭐ 1,540 · Updated 11 months ago
- Hypernetworks that adapt LLMs for specific benchmark tasks using only a textual task description as the input ⭐ 927 · Updated 6 months ago
- A general library for generating high-quality synthetic data from scratch or based on your own seed data. ⭐ 403 · Updated last week
- Scalable and Performant Data Loading ⭐ 352 · Updated this week
- A minimalistic framework for transparently training language models and storing comprehensive checkpoints for in-depth learning dynamics … ⭐ 294 · Updated 2 weeks ago
- Build datasets using natural language ⭐ 551 · Updated 2 months ago
- Multi-backend recommender systems with Keras 3 ⭐ 149 · Updated this week
- ⭐ 692 · Updated 7 months ago
- Inference, fine-tuning, and many more recipes with the Gemma family of models ⭐ 276 · Updated 4 months ago