NVIDIA / NeMo-Run
A tool to configure, launch and manage your machine learning experiments.
★62 · Updated this week
Related projects
Alternatives and complementary repositories for NeMo-Run
- Efficiently (pre)training foundation models with native PyTorch features, including FSDP for training and SDPA implementation of Flash… ★191 · Updated 3 weeks ago
- Some common Hugging Face transformers in maximal update parametrization (µP) ★76 · Updated 2 years ago
- This repository contains the experimental PyTorch native float8 training UX ★211 · Updated 3 months ago
- Collection of components for development, training, tuning, and inference of foundation models leveraging PyTorch native components. ★163 · Updated this week
- Google TPU optimizations for `transformers` models