lucas-maes / nano-simsiam
Minimalistic, hackable PyTorch implementation of SimSiam in ~400 lines. Achieves good performance on ImageNet with ResNet50. Features distributed training, real-time KNN eval, and AMP. Perfect for research prototyping.
☆ 19 · Updated last month
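SimSiam trains two augmented views of an image through a shared encoder and a small predictor head, using a stop-gradient on the target branch and a negative cosine-similarity loss. As a quick reference, here is a minimal sketch of that objective in PyTorch; the function name and tensor layout are illustrative assumptions, not code taken from this repository.

```python
# Minimal sketch of the SimSiam objective: symmetric negative cosine similarity
# with a stop-gradient on the target branch. Names and shapes are illustrative,
# not copied from nano-simsiam.
import torch
import torch.nn.functional as F

def simsiam_loss(p1, p2, z1, z2):
    """p1, p2: predictor outputs for the two views; z1, z2: projector outputs."""
    def neg_cosine(p, z):
        z = z.detach()  # stop-gradient: no gradients flow through the target branch
        return -F.cosine_similarity(p, z, dim=-1).mean()
    return 0.5 * (neg_cosine(p1, z2) + neg_cosine(p2, z1))
```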
Alternatives and similar repositories for nano-simsiam:
Users interested in nano-simsiam are comparing it to the libraries listed below.
- A simple hypernetwork implementation in jax using haiku. ☆ 23 · Updated 2 years ago
- Meta-learning inductive biases in the form of useful conserved quantities. ☆ 37 · Updated 2 years ago
- An implementation of PSGD Kron second-order optimizer for PyTorch. ☆ 22 · Updated 2 weeks ago
- GPT implementation in Flax. ☆ 18 · Updated 3 years ago
- Unofficial but Efficient Implementation of "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" in JAX. ☆ 82 · Updated 11 months ago
- Code for "Meta Learning Backpropagation And Improving It" @ NeurIPS 2021, https://arxiv.org/abs/2012.14905. ☆ 31 · Updated 3 years ago
- Quantification of Uncertainty with Adversarial Models. ☆ 27 · Updated last year
- Running Jax in PyTorch Lightning. ☆ 83 · Updated last month
- The Energy Transformer block, in JAX. ☆ 54 · Updated last year
- Code for https://arxiv.org/abs/2406.04329. ☆ 51 · Updated last month
- Fine-grained, dynamic control of neural network topology in JAX. ☆ 21 · Updated last year
- Graph neural networks in JAX. ☆ 67 · Updated 7 months ago
- Flexible meta-learning in jax. ☆ 12 · Updated last year
- Lightning-like training API for JAX with Flax. ☆ 36 · Updated last month
- Building blocks for productive research. ☆ 47 · Updated this week
- Code accompanying the paper "LaProp: a Better Way to Combine Momentum with Adaptive Gradient". ☆ 26 · Updated 4 years ago
- Hierarchical Associative Memory User Experience. ☆ 94 · Updated last year
- Pytorch-like dataloaders in JAX. ☆ 67 · Updated 3 months ago
- Transformer with Mu-Parameterization, implemented in Jax/Flax. Supports FSDP on TPU pods. ☆ 30 · Updated last month
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆ 31 · Updated last year
- Clockwork VAEs in JAX/Flax. ☆ 32 · Updated 3 years ago
- An implementation of the Llama architecture, to instruct and delight. ☆ 21 · Updated this week