ernoult / updatesEPgradientsBPTT
Updates of Equilibrium Prop Match Gradients of Backprop Through Time in an RNN with Static Input (NeurIPS 2019)
☆14 · Updated last year
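The repository's title claim, that Equilibrium Propagation (EP) updates match the gradients computed by backprop through time, can be illustrated on a toy model. The following is a minimal sketch (not code from the repository): a single scalar "neuron" with static input, where the EP update, computed from the free and weakly nudged equilibria, approaches the true loss gradient as the nudging strength β goes to zero.

```python
# Toy Equilibrium Propagation check (illustrative sketch, not the repo's code).
# Energy:  E(s, theta) = s**2 / 2 - theta * x * s   (one unit, static input x)
# Cost:    C(s) = (s - y)**2 / 2

x, y, theta = 0.5, 1.0, 0.3
beta = 1e-4  # nudging strength; EP matches the gradient in the limit beta -> 0

# Free phase: s minimizes E alone  ->  s0 = theta * x
s0 = theta * x

# Nudged phase: s minimizes E + beta * C  ->  s_beta = (theta*x + beta*y) / (1 + beta)
s_beta = (theta * x + beta * y) / (1 + beta)

# EP update: -(1/beta) * (dE/dtheta at s_beta - dE/dtheta at s0),
# where dE/dtheta = -x * s
ep_update = -(1.0 / beta) * ((-x * s_beta) - (-x * s0))

# True update: negative gradient of the loss C(s0(theta)) = (theta*x - y)**2 / 2
true_update = -(theta * x - y) * x

print(ep_update, true_update)  # nearly equal for small beta
```

Both quantities come out at roughly 0.425 here; the EP estimate differs from the true negative gradient only by a factor of 1/(1 + β). The NeurIPS 2019 paper establishes the analogous step-by-step correspondence for RNNs trained with BPTT.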
Alternatives and similar repositories for updatesEPgradientsBPTT
Users interested in updatesEPgradientsBPTT are comparing it to the repositories listed below.
- Fully documented PyTorch implementation of the Equilibrium Propagation algorithm. ☆35 · Updated 5 years ago
- A PyTorch implementation of EventProp [https://arxiv.org/abs/2009.08378], a method to train Spiking Neural Networks. ☆49 · Updated 4 years ago
- Learning to Learn: Gradient-free Optimization framework. ☆36 · Updated 3 years ago
- Implementation of feedback alignment learning in PyTorch. ☆31 · Updated 2 years ago
- Study on the applicability of Direct Feedback Alignment to neural view synthesis, recommender systems, geometric learning, and natural la… ☆88 · Updated 3 years ago
- Automatic Hebbian learning in multi-layer convolutional networks with PyTorch, by expressing Hebbian plasticity rules as gradients. ☆38 · Updated 2 years ago
- Official code for Coupled Oscillatory RNN (ICLR 2021, Oral). ☆45 · Updated 3 years ago
- BrainProp: How the brain can implement reward-based error backpropagation. ☆16 · Updated 2 years ago
- Testing Difference Target Propagation (DTP) on MNIST. ☆12 · Updated 4 years ago
- Python implementation of the methods in Meulemans et al. 2020 - A Theoretical Framework For Target Propagation. ☆32 · Updated 7 months ago
- BioTorch is a PyTorch framework specialized in biologically plausible learning algorithms. ☆49 · Updated last year
- Auryn-based simulation of multiplexing and burst-dependent plasticity. ☆23 · Updated 4 years ago
- Code for Limbacher, T. and Legenstein, R. (2020). H-Mem: Harnessing synaptic plasticity with Hebbian Memory Networks. ☆13 · Updated 3 years ago
- PyTorch-based code for training fully-connected and convolutional networks using backpropagation (BP), feedback alignment (FA), direct fe… ☆66 · Updated 4 years ago