vinayprabhu / X-is-all-you-need
A survey of all the 'X is all you need' papers
☆75 · Updated 4 years ago
Alternatives and similar repositories for X-is-all-you-need
Users interested in X-is-all-you-need are comparing it to the libraries listed below
- a lightweight transformer library for PyTorch · ☆72 · Updated 4 years ago
- Context Manager to profile the forward and backward times of PyTorch's nn.Module · ☆82 · Updated 2 years ago
- Functional deep learning · ☆108 · Updated 3 years ago
- Differentiable Algorithms and Algorithmic Supervision. · ☆116 · Updated 2 years ago
- A lightweight wrapper for PyTorch that provides a simple declarative API for context switching between devices, distributed modes, mixed-… · ☆66 · Updated 2 years ago
- Large dataset storage format for Pytorch · ☆45 · Updated 4 years ago
- Convert scikit-learn models to PyTorch modules · ☆166 · Updated last year
- Code for the anonymous submission "Cockpit: A Practical Debugging Tool for Training Deep Neural Networks" · ☆31 · Updated 4 years ago
- Lightweight ML Experiment Logging 📖 · ☆81 · Updated last year
- A case study of efficient training of large language models using commodity hardware. · ☆68 · Updated 3 years ago
- ☆67 · Updated 7 months ago
- Yet another mini autodiff system for educational purposes · ☆30 · Updated last week
- Python Research Framework · ☆106 · Updated 3 years ago
- A GPT, made only of MLPs, in Jax · ☆58 · Updated 4 years ago
- Toy implementations of some popular ML optimizers using Python/JAX · ☆44 · Updated 4 years ago
- pytest plugin for a better developer experience when working with the PyTorch test suite · ☆44 · Updated 3 years ago
- A framework for implementing equivariant DL · ☆10 · Updated 4 years ago
- Cyclemoid implementation for PyTorch · ☆90 · Updated 3 years ago
- Check if you have training samples in your test set · ☆64 · Updated 3 years ago
- Annotate python source code · ☆69 · Updated 5 years ago
- An active learning library for Pytorch based on Lightning-Fabric. · ☆79 · Updated last year
- Image augmentation library for Jax · ☆40 · Updated last year
- EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc in JAX w/ Flax Linen and Objax · ☆129 · Updated last year
- Official code repository of the paper Linear Transformers Are Secretly Fast Weight Programmers. · ☆110 · Updated 4 years ago
- "Moshpit SGD: Communication-Efficient Decentralized Training on Heterogeneous Unreliable Devices", official implementation · ☆29 · Updated 9 months ago
- Differentiable Sorting Networks · ☆124 · Updated 2 years ago
- Pytorch Lightning Distributed Accelerators using Ray · ☆215 · Updated 2 years ago
- ☆192 · Updated 4 months ago
- A collection of optimizers, some arcane others well known, for Flax. · ☆29 · Updated 4 years ago
- Search for scientific papers on the command line · ☆106 · Updated this week