JacksonWuxs / Forward-Forward-Network
Implementation of the Forward-Forward network proposed by Hinton at NeurIPS 2022.
☆170 · Updated 2 years ago
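The Forward-Forward algorithm referenced above trains each layer with a purely local objective instead of backpropagation: a layer's "goodness" is the sum of its squared activations, which is pushed above a threshold for positive (real) data and below it for negative data. The following is a minimal NumPy sketch of that idea only; the layer sizes, threshold `theta`, learning rate, and the toy positive/negative data are illustrative assumptions, not taken from the repository.

```python
import numpy as np

rng = np.random.default_rng(0)

def goodness(h):
    # "Goodness" of a layer's activity: sum of squared activations per sample.
    return (h ** 2).sum(axis=1)

class FFLayer:
    # One layer trained with a local Forward-Forward objective.
    # theta and lr are illustrative hyperparameters (assumptions).
    def __init__(self, d_in, d_out, theta=2.0, lr=0.03):
        self.W = rng.normal(0.0, 1.0 / np.sqrt(d_in), (d_in, d_out))
        self.theta, self.lr = theta, lr

    def forward(self, x):
        # Normalize input length so only the direction carries information
        # to the next layer, then apply ReLU.
        x = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
        return np.maximum(x @ self.W, 0.0)

    def local_step(self, x_pos, x_neg):
        # Push goodness of positive data above theta and of negative data
        # below it, using a logistic loss on sign * (goodness - theta).
        for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
            xn = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
            h = np.maximum(xn @ self.W, 0.0)
            p = 1.0 / (1.0 + np.exp(sign * (goodness(h) - self.theta)))
            # Gradient of the logistic loss w.r.t. W (ReLU mask via h > 0).
            grad = xn.T @ ((-sign * 2.0 * p)[:, None] * h * (h > 0))
            self.W -= self.lr * grad / len(x)

layer = FFLayer(8, 16)
x_pos = rng.normal(1.0, 0.5, (32, 8))   # stand-in "positive" data (assumption)
x_neg = rng.normal(0.0, 0.5, (32, 8))   # stand-in "negative" data (assumption)
for _ in range(200):
    layer.local_step(x_pos, x_neg)
g_pos = goodness(layer.forward(x_pos)).mean()
g_neg = goodness(layer.forward(x_neg)).mean()
```

Because each layer optimizes only its own goodness, layers can be stacked and trained greedily with no backward pass between them, which is the property the listed reimplementations explore.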
Alternatives and similar repositories for Forward-Forward-Network
Users interested in Forward-Forward-Network are comparing it to the libraries listed below.
- Implementation of Soft MoE, proposed by Brain's Vision team, in Pytorch ☆328 · Updated 6 months ago
- ☆196 · Updated last year
- Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in Pytorch ☆362 · Updated last year
- Reimplementation of Geoffrey Hinton's Forward-Forward Algorithm ☆156 · Updated last year
- ICLR 2023 paper-submission analysis from https://openreview.net/group?id=ICLR.cc/2023/Conference ☆106 · Updated 3 years ago
- [EVA ICLR'23; LARA ICML'22] Efficient attention mechanisms via control variates, random features, and importance sampling ☆86 · Updated 2 years ago
- ☆142 · Updated last year
- Crawl & visualize ICLR 2023 data from OpenReview ☆85 · Updated 2 years ago
- ☆21 · Updated 2 years ago
- PyTorch implementation of "From Sparse to Soft Mixtures of Experts" ☆64 · Updated 2 years ago
- Implementation of Block Recurrent Transformer - Pytorch ☆221 · Updated last year
- ☆292 · Updated 10 months ago
- ☆33 · Updated 4 years ago
- Implementation of Linformer for Pytorch ☆299 · Updated last year
- Implementation of the Transformer variant proposed in "Transformer Quality in Linear Time" ☆369 · Updated 2 years ago
- The official PyTorch implementation of the paper: Xili Dai, Shengbang Tong, et al. "Closed-Loop Data Transcription to an LDR via Minimaxi… ☆63 · Updated 2 years ago
- Official implementation of TransNormerLLM: A Faster and Better LLM ☆247 · Updated last year
- [EMNLP 2022] Official implementation of TransNormer from the EMNLP 2022 paper "The Devil in Linear Transformer" ☆63 · Updated 2 years ago
- ☆65 · Updated 3 years ago
- ☆105 · Updated last year
- Implementation of Recurrent Memory Transformer (NeurIPS 2022 paper) in Pytorch ☆417 · Updated 9 months ago
- PyTorch implementation of LIMoE ☆52 · Updated last year
- PyTorch repository for the ICLR 2022 paper on GSAM, which improves generalization (e.g. +3.8% top-1 accuracy on ImageNet with ViT-B/32) ☆144 · Updated 3 years ago
- [ICLR 2023] "Sparse MoE as the New Dropout: Scaling Dense and Self-Slimmable Transformers" by Tianlong Chen*, Zhenyu Zhang*, Ajay Jaiswal… ☆55 · Updated 2 years ago
- [ICLR 2022] Official implementation of cosformer-attention from "cosFormer: Rethinking Softmax in Attention" ☆195 · Updated 2 years ago
- Recurrent Memory Transformer ☆150 · Updated 2 years ago
- Implementation of "Attention Is Off By One" by Evan Miller ☆196 · Updated 2 years ago
- A repository for DenseSSMs ☆88 · Updated last year
- ICLR 2023 statistics ☆59 · Updated last year
- Implementation of Infini-Transformer in Pytorch ☆113 · Updated 9 months ago