loeweX / Forward-Forward
Reimplementation of Geoffrey Hinton's Forward-Forward Algorithm
(☆ 132, updated last year)

Related projects
Alternatives and complementary repositories for Forward-Forward
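For orientation, the idea shared by several of the repositories below is the Forward-Forward layer-local objective: each layer is trained on its own "goodness" (sum of squared activations), pushed above a threshold for positive data and below it for negative data, with no backpropagation between layers. The following is a minimal NumPy sketch under stated assumptions (ReLU layer, logistic loss on goodness, threshold theta=2.0 and learning rate chosen for illustration); it is not taken from any of the listed implementations.

```python
import numpy as np

def goodness(h):
    # Goodness of a layer's activity: sum of squared activations per sample.
    return np.sum(h ** 2, axis=-1)

def ff_layer_step(W, x_pos, x_neg, theta=2.0, lr=0.03):
    """One local Forward-Forward update for a single ReLU layer.

    Positive samples are pushed to have goodness above `theta`,
    negative samples below it. The gradient is computed for this
    layer alone; nothing propagates to other layers.
    """
    def forward(x):
        return np.maximum(x @ W, 0.0)

    h_pos, h_neg = forward(x_pos), forward(x_neg)
    # Logistic probability that the layer calls a sample "positive".
    p_pos = 1.0 / (1.0 + np.exp(-(goodness(h_pos) - theta)))
    p_neg = 1.0 / (1.0 + np.exp(-(goodness(h_neg) - theta)))
    # Hand-derived gradients of softplus(theta - g_pos) + softplus(g_neg - theta)
    # w.r.t. W, using dg/dh = 2h and the ReLU mask (h > 0).
    grad_pos = -2.0 * x_pos.T @ ((1.0 - p_pos)[:, None] * h_pos * (h_pos > 0))
    grad_neg = 2.0 * x_neg.T @ (p_neg[:, None] * h_neg * (h_neg > 0))
    return W - lr * (grad_pos + grad_neg) / len(x_pos)
```

Repeating `ff_layer_step` on a batch should raise the mean goodness of the positive samples while suppressing that of the negative ones, which is the training signal each layer uses in isolation.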
- Signal Propagation (sigprop): a forward-pass learning and inference library for neural networks and general intelligence (☆ 45, updated last year)
- Implementation/simulation of the predictive forward-forward credit-assignment algorithm for training neurobiologically plausible recurren… (☆ 55, updated last year)
- Implementation of the Forward-Forward network proposed by Hinton at NeurIPS 2022 (☆ 162, updated last year)
- Explorations with Geoffrey Hinton's Forward-Forward algorithm (☆ 33, updated 10 months ago)
- Spyx: Spiking Neural Networks in JAX (☆ 102, updated last month)
- seqax = sequence modeling + JAX (☆ 134, updated 4 months ago)
- NanoGPT-like codebase for LLM training (☆ 75, updated this week)
- Parallelizing non-linear sequential models over the sequence length (☆ 45, updated 3 weeks ago)
- Understand and test language model architectures on synthetic tasks (☆ 163, updated 6 months ago)
- Emergent world representations: exploring a sequence model trained on a synthetic task (☆ 170, updated last year)
- A centralized place for deep-thinking code and experiments (☆ 77, updated last year)
- Easy Hypernetworks in PyTorch and JAX (☆ 96, updated last year)
- Replicating and dissecting the git-re-basin project in one-click-replication Colabs (☆ 36, updated 2 years ago)
- Code for the paper "Inferring Neural Activity Before Plasticity: A Foundation for Learning Beyond Backpropag…" (☆ 80, updated 10 months ago)
- A MAD laboratory to improve AI architecture designs 🧪 (☆ 95, updated 6 months ago)
- Neural Networks and the Chomsky Hierarchy (☆ 187, updated 7 months ago)
- A repository for log-time feedforward networks (☆ 216, updated 7 months ago)
- PyTorch implementation of Mixer-nano (0.67M parameters, vs. 18M for the original Mixer-S/16) reaching 90.83% accuracy on CIFAR-10. Training from s… (☆ 28, updated 3 years ago)
- A simple library for scaling up JAX programs (☆ 127, updated 3 weeks ago)