bhoov / hamux
Hierarchical Associative Memory User Experience
☆100 · Updated last year
Alternatives and similar repositories for hamux:
Users interested in hamux are comparing it to the libraries listed below.
- The Energy Transformer block, in JAX ☆56 · Updated last year
- Official Implementation of the ICML 2023 paper: "Neural Wave Machines: Learning Spatiotemporally Structured Representations with Locally … ☆70 · Updated last year
- ☆192 · Updated 10 months ago
- Loopy belief propagation for factor graphs on discrete variables in JAX ☆144 · Updated 5 months ago
- Running Jax in PyTorch Lightning ☆89 · Updated 3 months ago
- ☆112 · Updated last month
- JAX Arrays for human consumption ☆90 · Updated last year
- Unofficial implementation of the Linear Recurrent Unit (LRU, Orvieto et al. 2023) ☆52 · Updated 4 months ago
- A Python package of computer vision models for the Equinox ecosystem. ☆103 · Updated 8 months ago
- Pytorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioner, low-rank approximation precondition… ☆169 · Updated 3 months ago
- ☆154 · Updated last year
- Code for the paper "Predictive Coding Approximates Backprop along Arbitrary Computation Graphs" ☆151 · Updated 4 years ago
- ☆245 · Updated 5 months ago
- Meta-learning inductive biases in the form of useful conserved quantities. ☆37 · Updated 2 years ago
- Mathematical operations for JAX pytrees ☆198 · Updated 3 months ago
- Compositional Linear Algebra ☆465 · Updated this week
- A functional training loops library for JAX ☆86 · Updated last year
- Predictive Coding JAX-based library ☆48 · Updated 3 weeks ago
- Parameter-Free Optimizers for Pytorch ☆122 · Updated 10 months ago
- Example of Dense Associative Memory training on MNIST ☆35 · Updated 2 years ago
- Bayesian inference with Python and Jax. ☆32 · Updated 2 years ago
- Official repository for the paper "Can You Learn an Algorithm? Generalizing from Easy to Hard Problems with Recurrent Networks" ☆60 · Updated 3 years ago
- ☆59 · Updated 3 years ago
- 🧱 Modula software package ☆172 · Updated last week
- Brain-Inspired Modular Training (BIMT), a method for making neural networks more modular and interpretable. ☆165 · Updated last year
- Composable kernels for scikit-learn implemented in JAX. ☆43 · Updated 4 years ago
- ☆220 · Updated last month
- Image augmentation library for Jax ☆38 · Updated 11 months ago
- Neural Networks for JAX ☆83 · Updated 5 months ago
- Differentiable Algorithms and Algorithmic Supervision. ☆113 · Updated last year