AI-Hypercomputer / kithara
☆14 · Updated 3 months ago
Alternatives and similar repositories for kithara
Users interested in kithara are comparing it to the libraries listed below.
- torchprime is a reference model implementation for PyTorch on TPU. ☆34 · Updated this week
- ☆15 · Updated 3 months ago
- ☆143 · Updated 2 weeks ago
- PyTorch/XLA integration with JetStream (https://github.com/google/JetStream) for LLM inference ☆68 · Updated 4 months ago
- xpk (Accelerated Processing Kit, pronounced x-p-k) is a software tool to help Cloud developers orchestrate training jobs on accelerat… ☆138 · Updated this week
- JetStream is a throughput and memory optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in future -- PRs wel… ☆369 · Updated 2 months ago
- ☆23 · Updated last week
- A JAX-native LLM Post-Training Library ☆123 · Updated this week
- ☆19 · Updated 3 weeks ago
- ☆188 · Updated 3 weeks ago
- Pax is a Jax-based machine learning framework for training large scale models. Pax allows for advanced and fully configurable experimenta… ☆526 · Updated last week
- TorchX is a universal job launcher for PyTorch applications. TorchX is designed to have fast iteration time for training/research and sup… ☆383 · Updated this week
- Recipes for reproducing training and serving benchmarks for large machine learning models using GPUs on Google Cloud. ☆78 · Updated 2 weeks ago
- Experimenting with how best to do multi-host dataloading ☆10 · Updated 2 years ago
- Fault tolerance for PyTorch (HSDP, LocalSGD, DiLoCo, Streaming DiLoCo) ☆383 · Updated last week
- Scalable and Performant Data Loading ☆291 · Updated this week
- Two implementations of ZeRO-1 optimizer sharding in JAX ☆14 · Updated 2 years ago
- 🚀 Efficiently (pre)training foundation models with native PyTorch features, including FSDP for training and SDPA implementation of Flash… ☆261 · Updated last month
- ☆16 · Updated 5 months ago
- ☆526 · Updated last year
- Pragmatic approach to parsing import profiles for CIs ☆11 · Updated last year
- A tool to configure, launch, and manage your machine learning experiments. ☆183 · Updated this week
- Google TPU optimizations for transformers models ☆118 · Updated 7 months ago
- NVIDIA Resiliency Extension is a Python package for framework developers and users to implement fault-tolerant features. It improves the … ☆206 · Updated last week
- Implementation of Flash Attention in Jax ☆216 · Updated last year
- Pytorch/XLA SPMD Test code in Google TPU ☆23 · Updated last year
- Legible, Scalable, Reproducible Foundation Models with Named Tensors and Jax ☆646 · Updated this week
- ☆43 · Updated last week
- Minimal yet performant LLM examples in pure JAX ☆148 · Updated this week
- ☆325 · Updated 3 weeks ago