IDSIA / recurrent-fwp
Official repository for the paper "Going Beyond Linear Transformers with Recurrent Fast Weight Programmers" (NeurIPS 2021)
☆50 · Updated 3 months ago
Alternatives and similar repositories for recurrent-fwp
Users interested in recurrent-fwp are comparing it to the libraries listed below.
- Official code repository of the paper "Linear Transformers Are Secretly Fast Weight Programmers". ☆105 · Updated 4 years ago
- Variational Reinforcement Learning ☆16 · Updated last year
- Implementation of Hierarchical Transformer Memory (HTM) for PyTorch ☆75 · Updated 4 years ago
- Usable implementation of Emerging Symbol Binding Network (ESBN), in PyTorch ☆25 · Updated 4 years ago
- [NeurIPS'20] Code for the paper "Compositional Visual Generation and Inference with Energy Based Models" ☆46 · Updated 2 years ago
- [ICML'21] Improved Contrastive Divergence Training of Energy Based Models ☆65 · Updated 3 years ago
- Generalised UDRL ☆37 · Updated 3 years ago
- ☆55 · Updated 10 months ago
- The official repository for our paper "The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers". We s… ☆67 · Updated 2 years ago
- ☆23 · Updated 3 years ago
- An implementation of the 2021 paper by Geoffrey Hinton, "How to represent part-whole hierarchies in a neural network", in PyTorch ☆57 · Updated 4 years ago
- Open source code for the paper "On the Learning and Learnability of Quasimetrics" ☆31 · Updated 2 years ago
- Reparameterize your PyTorch modules ☆71 · Updated 4 years ago
- Meta-learning inductive biases in the form of useful conserved quantities ☆37 · Updated 2 years ago
- Implementation of a Transformer that Ponders, using the scheme from the PonderNet paper ☆81 · Updated 3 years ago
- Code to reproduce the results for Compositional Attention ☆60 · Updated 2 years ago
- Estimating Gradients for Discrete Random Variables by Sampling without Replacement ☆40 · Updated 5 years ago
- Experiments for Meta-Learning Symmetries by Reparameterization ☆57 · Updated 4 years ago
- ☆30 · Updated 3 years ago
- The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We… ☆46 · Updated last year
- Code for "Recurrent Independent Mechanisms"