dvruette / barrel-rec-pytorch
☆53 Updated 11 months ago
Alternatives and similar repositories for barrel-rec-pytorch:
Users interested in barrel-rec-pytorch are comparing it to the libraries listed below.
- Supporting PyTorch FSDP for optimizers ☆75 Updated last month
- ☆75 Updated 6 months ago
- Minimal (400 LOC) implementation of maximum (multi-node, FSDP) GPT training ☆121 Updated 9 months ago
- A MAD laboratory to improve AI architecture designs 🧪 ☆102 Updated last month
- ☆51 Updated 7 months ago
- ☆33 Updated 4 months ago
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆90 Updated last month
- ☆78 Updated 9 months ago
- Focused on fast experimentation and simplicity ☆64 Updated 3 weeks ago
- ☆49 Updated 10 months ago
- ☆50 Updated 3 months ago
- Understand and test language model architectures on synthetic tasks. ☆175 Updated this week
- WIP ☆92 Updated 5 months ago
- ☆19 Updated 3 months ago
- ☆20 Updated last year
- Minimal but scalable implementation of large language models in JAX ☆28 Updated 2 months ago
- ☆146 Updated last month
- Experiment of using Tangent to autodiff Triton ☆74 Updated 11 months ago
- Fast, Modern, Memory Efficient, and Low Precision PyTorch Optimizers ☆77 Updated 6 months ago
- Official repository for the paper "Approximating Two-Layer Feedforward Networks for Efficient Transformers" ☆36 Updated last year
- Token Omission Via Attention ☆122 Updated 3 months ago
- 🧱 Modula software package ☆132 Updated this week
- Latent Diffusion Language Models ☆68 Updated last year
- Explorations into the recently proposed Taylor Series Linear Attention ☆91 Updated 4 months ago
- Collection of autoregressive model implementations ☆76 Updated last week
- ☆69 Updated 4 months ago
- Transformer with Mu-Parameterization, implemented in Jax/Flax. Supports FSDP on TPU pods. ☆30 Updated last month
- A basic pure PyTorch implementation of flash attention ☆16 Updated 2 months ago
- Triton Implementation of HyperAttention Algorithm ☆46 Updated last year
- LoRA for arbitrary JAX models and functions ☆135 Updated 10 months ago
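Several entries in the list concern memory-efficient attention (the pure-PyTorch flash attention implementation, the Triton HyperAttention port). As a rough illustration of the online-softmax trick those projects build on, here is a minimal NumPy sketch. It is not taken from any listed repository, and the function names are hypothetical.

```python
import numpy as np

def naive_attention(q, k, v):
    # Reference softmax attention: materializes the full score matrix.
    s = q @ k.T / np.sqrt(q.shape[-1])
    p = np.exp(s - s.max(axis=-1, keepdims=True))
    return (p / p.sum(axis=-1, keepdims=True)) @ v

def online_attention(q, k, v, block=4):
    # Flash-attention-style streaming over key/value blocks: keep a
    # running row max (m) and normalizer (l) so the full n x n score
    # matrix is never materialized.
    n, d = q.shape
    m = np.full(n, -np.inf)            # running max per query row
    l = np.zeros(n)                    # running softmax denominator
    acc = np.zeros((n, v.shape[-1]))   # unnormalized output accumulator
    for i in range(0, k.shape[0], block):
        kb, vb = k[i:i + block], v[i:i + block]
        s = q @ kb.T / np.sqrt(d)
        m_new = np.maximum(m, s.max(axis=-1))
        scale = np.exp(m - m_new)      # rescale previous state to new max
        p = np.exp(s - m_new[:, None])
        l = l * scale + p.sum(axis=-1)
        acc = acc * scale[:, None] + p @ vb
        m = m_new
    return acc / l[:, None]
```

Streaming over key/value blocks keeps the working set linear in sequence length rather than quadratic, which is the core idea the flash-attention entries above implement (on GPU, with fused kernels).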