learning-at-home / hivemind
Decentralized deep learning in PyTorch. Built to train models on thousands of volunteers across the world.
☆2,099 · Updated last month
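For context, a minimal collaborative-training sketch in the spirit of hivemind's documented quickstart: each peer wraps a regular PyTorch optimizer in `hivemind.Optimizer`, and peers sharing the same `run_id` average gradients over the DHT until a global target batch size is reached. The toy model, run id, and batch sizes below are illustrative placeholders, not values taken from this page.

```python
import torch
import torch.nn.functional as F
import hivemind

# Toy model and local optimizer (placeholders for illustration).
model = torch.nn.Linear(16, 2)
local_opt = torch.optim.SGD(model.parameters(), lr=0.1)

# Start a DHT node; additional peers would pass initial_peers=[...] to join this one.
dht = hivemind.DHT(start=True)

# Peers using the same run_id form one collaboration: gradients are averaged
# across the network, and an optimizer step happens once target_batch_size
# samples have been processed globally.
opt = hivemind.Optimizer(
    dht=dht,
    run_id="demo_run",          # assumption: any shared string naming the training run
    optimizer=local_opt,
    batch_size_per_step=32,     # samples processed locally per .step() call
    target_batch_size=1024,     # global samples per collaborative update
    verbose=True,
)

for _ in range(100):
    x = torch.randn(32, 16)
    y = torch.randint(0, 2, (32,))
    loss = F.cross_entropy(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```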
Alternatives and similar repositories for hivemind:
Users interested in hivemind are comparing it to the libraries listed below
- Your PyTorch AI Factory - Flash enables you to easily configure and run complex AI recipes for over 15 tasks across 7 data domains ☆1,744 · Updated last year
- Deploy an ML inference service on a budget in less than 10 lines of code. ☆1,344 · Updated 11 months ago
- Version control for machine learning ☆1,654 · Updated this week
- AITemplate is a Python framework which renders neural networks into high-performance CUDA/HIP C++ code. Specialized for FP16 TensorCore (N… ☆4,590 · Updated last month
- Python bindings for the Transformer models implemented in C/C++ using GGML library. ☆1,831 · Updated last year
- Hummingbird compiles trained ML models into tensor computation for faster inference. ☆3,379 · Updated last week
- Sparsity-aware deep learning inference runtime for CPUs ☆3,088 · Updated 6 months ago
- 🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading ☆9,385 · Updated 4 months ago
- Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models ☆2,096 · Updated 5 months ago
- Accessible large language models via k-bit quantization for PyTorch. ☆6,573 · Updated this week
- MADGRAD Optimization Method ☆803 · Updated this week
- ♾️ CML - Continuous Machine Learning | CI/CD for ML ☆4,061 · Updated last week
- Make PyTorch models up to 40% faster! Thunder is a source-to-source compiler for PyTorch. It enables using different hardware executors a… ☆1,269 · Updated this week
- ClearML - Auto-Magical CI/CD to streamline your AI workload. Experiment Management, Data Management, Pipeline, Orchestration, Scheduling … ☆5,795 · Updated this week
- Multi-LoRA inference server that scales to 1000s of fine-tuned LLMs ☆2,323 · Updated this week
- 🤖 A PyTorch library of curated Transformer models and their composable components ☆874 · Updated 9 months ago
- MII makes low-latency and high-throughput inference possible, powered by DeepSpeed. ☆1,950 · Updated last week
- Flax is a neural network library for JAX that is designed for flexibility. ☆6,297 · Updated this week
- 🚀 Accelerate inference and training of 🤗 Transformers, Diffusers, TIMM and Sentence Transformers with easy to use hardware optimization… ☆2,708 · Updated this week
- The WeightWatcher tool for predicting the accuracy of Deep Neural Networks ☆1,526 · Updated 4 months ago
- Aim 💫 — An easy-to-use & supercharged open-source experiment tracker. ☆5,330 · Updated this week
- Implementation of Memorizing Transformers (ICLR 2022), attention net augmented with indexing and retrieval of memories using approximate … ☆629 · Updated last year
- PyTorch extensions for high performance and large scale training. ☆3,238 · Updated 2 weeks ago
- Determined is an open-source machine learning platform that simplifies distributed training, hyperparameter tuning, experiment tracking, … ☆3,087 · Updated this week
- Mesh TensorFlow: Model Parallelism Made Easier ☆1,600 · Updated last year
- A uniform interface to run deep learning models from multiple frameworks ☆936 · Updated last year
- maximal update parametrization (µP) ☆1,441 · Updated 6 months ago
- Official code for "Distributed Deep Learning in Open Collaborations" (NeurIPS 2021) ☆116 · Updated 3 years ago
- OpenDiLoCo: An Open-Source Framework for Globally Distributed Low-Communication Training ☆426 · Updated 2 weeks ago