amjadmajid / BabyTorch
BabyTorch is a minimalist deep-learning framework with an API similar to PyTorch's. The minimal design encourages learners to explore and understand the underlying algorithms and mechanics of deep learning. It is designed so that, when learners are ready to switch to PyTorch, they only need to remove the word `baby` (see the sketch below).
☆26 · Updated 2 months ago
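As an illustration of the "remove the word `baby`" claim, here is a minimal sketch of the advertised workflow. It assumes BabyTorch exposes a PyTorch-style API under a `babytorch` package; the module and class names below are assumptions for illustration, not taken from the BabyTorch source.

```python
# Hypothetical sketch: assumes BabyTorch mirrors PyTorch's nn.Module API
# under a `babytorch` package name, per the README's claim.
import babytorch.nn as nn


class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)  # one fully connected layer

    def forward(self, x):
        return self.fc(x)


# Graduating to PyTorch would then only change the import:
#   import torch.nn as nn
```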
Alternatives and similar repositories for BabyTorch
Users interested in BabyTorch are comparing it to the libraries listed below.
- Cost-aware hyperparameter tuning algorithm ☆169 · Updated last year
- Latent Program Network (from the "Searching Latent Program Spaces" paper) ☆93 · Updated 5 months ago
- Minimal yet performant LLM examples in pure JAX ☆150 · Updated last week
- ☆115 · Updated 2 months ago
- Implementation of Diffusion Transformer (DiT) in JAX ☆292 · Updated last year
- ☆275 · Updated last year
- ☆150 · Updated last year
- Minimal but scalable implementation of large language models in JAX ☆35 · Updated this week
- 🧱 Modula software package ☆231 · Updated 2 weeks ago
- Synchronized Curriculum Learning for RL Agents ☆112 · Updated last week
- ☆101 · Updated 2 weeks ago
- Official JAX implementation of xLSTM, including fast and efficient training and inference code. 7B model available at https://huggingface.… ☆101 · Updated 7 months ago
- Fast + parallel AlphaZero in JAX ☆96 · Updated 8 months ago
- Solve puzzles. Learn CUDA. ☆64 · Updated last year
- Accelerated MiniGrid environments with JAX ☆144 · Updated 2 weeks ago
- A simple library for scaling up JAX programs ☆143 · Updated 10 months ago
- PyTorch implementation of Evolutionary Policy Optimization, from Wang et al. of the Robotics Institute at Carnegie Mellon University ☆97 · Updated last month
- CIFAR-10 speedruns: 94% in 2.6 seconds and 96% in 27 seconds ☆288 · Updated last month
- seqax = sequence modeling + JAX ☆166 · Updated last month
- Custom Triton kernels for training Karpathy's nanoGPT. ☆19 · Updated 10 months ago
- Maximal Update Parametrization (μP) with Flax & Optax. ☆16 · Updated last year
- Minimal, lightweight JAX implementations of popular models. ☆96 · Updated last week
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆36 · Updated last year
- (Crafter + NetHack) in JAX. ICML 2024 Spotlight. ☆331 · Updated last month
- Running JAX in PyTorch Lightning ☆113 · Updated 8 months ago
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆156 · Updated 2 months ago
- Efficient optimizers ☆259 · Updated last month
- JAX/Flax rewrite of Karpathy's nanoGPT ☆59 · Updated 2 years ago
- Flax (JAX) implementation of DeepSeek-R1-Distill-Qwen-1.5B with weights ported from Hugging Face. ☆22 · Updated 6 months ago
- ☆450 · Updated 10 months ago