JannikSt / ibtop
Real-time terminal monitor for InfiniBand networks - htop for high-speed interconnects
☆40 · Updated 2 weeks ago
Alternatives and similar repositories for ibtop
Users interested in ibtop are comparing it to the repositories listed below.
- Modded vLLM to run pipeline parallelism over public networks ☆39 · Updated 3 months ago
- PCCL (Prime Collective Communications Library) implements fault tolerant collective communications over IP ☆120 · Updated this week
- SIMD quantization kernels ☆87 · Updated last week
- A 7B parameter model for mathematical reasoning ☆40 · Updated 7 months ago
- Training-Ready RL Environments + Evals ☆90 · Updated this week
- Storing long contexts in tiny caches with self-study ☆181 · Updated this week
- ☆223 · Updated 2 months ago
- ☆141 · Updated last week
- ☆99 · Updated last week
- Decentralized RL Training at Scale ☆592 · Updated this week
- Simple Transformer in Jax ☆139 · Updated last year
- Long context evaluation for large language models ☆221 · Updated 6 months ago
- The Prime Intellect CLI provides a powerful command-line interface for managing GPU resources across various providers ☆84 · Updated this week
- ☆21 · Updated 8 months ago
- Compiling useful links, papers, benchmarks, ideas, etc. ☆45 · Updated 6 months ago
- rl from zero pretrain, can it be done? yes. ☆268 · Updated 3 weeks ago
- an open source reproduction of NVIDIA's nGPT (Normalized Transformer with Representation Learning on the Hypersphere) ☆105 · Updated 6 months ago
- peer-to-peer compute and intelligence network that enables decentralized AI development at scale ☆119 · Updated last month
- ☆133 · Updated 5 months ago
- A set of Python scripts that makes your experience on TPU better ☆54 · Updated last year
- Scaling is a distributed training library and installable dependency designed to scale up neural networks, with a dedicated module for tr… ☆65 · Updated 10 months ago
- Atropos is a Language Model Reinforcement Learning Environments framework for collecting and evaluating LLM trajectories through diverse … ☆692 · Updated this week
- train entropix like a champ! ☆20 · Updated 11 months ago
- NSA Triton Kernels written with GPT5 and Opus 4.1 ☆65 · Updated last month
- Open source interpretability artefacts for R1. ☆158 · Updated 4 months ago
- Large scale 4D parallelism pre-training for 🤗 transformers in Mixture of Experts *(still work in progress)* ☆87 · Updated last year
- smolLM with Entropix sampler on pytorch ☆150 · Updated 10 months ago
- Simple & Scalable Pretraining for Neural Architecture Research ☆291 · Updated 3 weeks ago
- DeMo: Decoupled Momentum Optimization ☆190 · Updated 9 months ago
- look how they massacred my boy ☆64 · Updated 11 months ago