Trel725 / forward-forward
A simple Python implementation of Geoffrey Hinton's Forward-Forward neural-network training algorithm (NeurIPS 2022)
☆21 · Updated 2 years ago
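The Forward-Forward algorithm replaces backpropagation with two local forward passes per layer: a "positive" pass on real data and a "negative" pass on corrupted data. Each layer adjusts its own weights so that a per-layer "goodness" score (the sum of squared activations) rises above a threshold for positive inputs and falls below it for negative ones. A minimal NumPy sketch of one such layer — not the repository's actual code; the class name, learning rate, and threshold are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

class FFLayer:
    """One Forward-Forward layer trained with a purely local rule (sketch)."""

    def __init__(self, n_in, n_out, lr=0.03, threshold=2.0):
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.lr = lr
        self.threshold = threshold

    def forward(self, x):
        # Length-normalize the input so only its direction (not its own
        # goodness) is passed on to the next layer.
        x = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
        return np.maximum(0.0, x @ self.W)  # ReLU activations

    def goodness(self, x):
        # Goodness = sum of squared activations.
        return (self.forward(x) ** 2).sum(axis=1)

    def train_step(self, x_pos, x_neg):
        # Logistic loss pushing positive goodness above the threshold
        # and negative goodness below it; gradient is computed locally,
        # without any backward pass through other layers.
        for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
            xn = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
            h = np.maximum(0.0, xn @ self.W)
            g = (h ** 2).sum(axis=1)
            p = 1.0 / (1.0 + np.exp(-sign * (g - self.threshold)))
            grad_g = -sign * (1.0 - p)          # d(-log p)/dg
            grad_h = 2.0 * h * grad_g[:, None]  # through g = sum(h^2)
            grad_h *= (xn @ self.W) > 0         # ReLU mask
            self.W -= self.lr * xn.T @ grad_h / len(x)

# Toy positive/negative data drawn from two different distributions.
layer = FFLayer(8, 16)
x_pos = rng.normal(1.0, 0.5, (32, 8))
x_neg = rng.normal(-1.0, 0.5, (32, 8))
for _ in range(200):
    layer.train_step(x_pos, x_neg)
# After training, positive goodness should exceed negative goodness on average.
```

Deeper networks stack such layers, feeding each layer the normalized output of the previous one; because the loss is local, no gradients flow between layers.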
Alternatives and similar repositories for forward-forward
Users interested in forward-forward are comparing it to the repositories listed below.
- Revisiting Efficient Training Algorithms For Transformer-based Language Models (NeurIPS 2023) ☆80 · Updated 2 years ago
- Code Release for "Broken Neural Scaling Laws" (BNSL) paper ☆59 · Updated last year
- ☆75 · Updated 2 years ago
- Experiments on GPT-3's ability to fit numerical models in-context. ☆14 · Updated 3 years ago
- This repository includes code to reproduce the tables in "Loss Landscapes are All You Need: Neural Network Generalization Can Be Explaine… ☆40 · Updated 2 years ago
- Reimplementation of Geoffrey Hinton's Forward-Forward Algorithm ☆156 · Updated last year
- HomebrewNLP in JAX flavour for maintainable TPU training ☆50 · Updated last year
- [NeurIPS 2023] Learning Transformer Programs ☆162 · Updated last year
- ☆43 · Updated 3 years ago
- Yet another random morning idea to be quickly tried and architecture shared if it works; to allow the transformer to pause for any amount… ☆53 · Updated last year
- My explorations into editing the knowledge and memories of an attention network ☆34 · Updated 2 years ago
- unofficial re-implementation of "Grokking: Generalization Beyond Overfitting on Small Algorithmic Datasets" ☆79 · Updated 3 years ago
- Code for the paper "The Impact of Positional Encoding on Length Generalization in Transformers", NeurIPS 2023 ☆135 · Updated last year
- Explores the ideas presented in Deep Ensembles: A Loss Landscape Perspective (https://arxiv.org/abs/1912.02757) by Stanislav Fort, Huiyi … ☆66 · Updated 5 years ago
- Blog post ☆17 · Updated last year
- The official repository for our paper "The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers". We s… ☆67 · Updated 2 years ago
- An implementation of Transformer with Expire-Span, a circuit for learning which memories to retain ☆34 · Updated 4 years ago
- Official implementation of the transformer (TF) architecture suggested in a paper entitled "Looped Transformers as Programmable Computers… ☆27 · Updated 2 years ago
- Recycling diverse models ☆45 · Updated 2 years ago
- ☆54 · Updated 2 years ago
- ☆45 · Updated 2 years ago
- Universal Neurons in GPT2 Language Models ☆30 · Updated last year
- Why Do We Need Weight Decay in Modern Deep Learning? [NeurIPS 2024] ☆68 · Updated last year
- ☆62 · Updated 3 years ago
- ☆166 · Updated 2 years ago
- A library to create and manage configuration files, especially for machine learning projects. ☆79 · Updated 3 years ago
- ☆40 · Updated last month
- The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We… ☆46 · Updated 2 years ago
- ModelDiff: A Framework for Comparing Learning Algorithms ☆59 · Updated 2 years ago
- ☆85 · Updated last year