Accompanying code for "Transformer Feed-Forward Layers Are Key-Value Memories" (Mor Geva, Roei Schuster, Jonathan Berant, and Omer Levy; EMNLP 2021).
☆99 · Updated Sep 5, 2021
Alternatives and similar repositories for ff-layers
Users interested in ff-layers are comparing it to the repositories listed below.
- ☆68 · Updated May 18, 2023
- Code for the ACL 2022 paper "Knowledge Neurons in Pretrained Transformers" (☆173 · Updated May 4, 2024)
- ☆57 · Updated Jun 15, 2023
- Code for the paper "Does Localization Inform Editing? Surprising Differences in Where Knowledge Is Stored vs. Ca…" (☆61 · Updated May 9, 2023)
- Locating and editing factual associations in GPT (NeurIPS 2022) (☆728 · Updated Apr 20, 2024)
- Official code of LM-Debugger, an interactive tool for inspection and intervention in transformer-based language models (☆184 · Updated May 13, 2022)
- ☆16 · Updated Jul 10, 2023
- ☆16 · Updated May 14, 2024
- ☆14 · Updated Apr 27, 2022
- Redwood Research's transformer interpretability tools (☆15 · Updated Apr 15, 2022)
- ☆29 · Updated Apr 30, 2024
- Code for the ICLR 2022 paper "Exploring Extreme Parameter Compression for Pre-trained Language Models" (https://arxiv.org/abs/2205.10036) (☆22 · Updated May 24, 2023)
- Official codebase accompanying the ACL 2022 paper "RELiC: Retrieving Evidence for Literary Claims" (https://relic.cs.umass.edu) (☆20 · Updated May 14, 2022)
- Tools for understanding how transformer predictions are built layer by layer (☆567 · Updated Aug 7, 2025)
- Dataset for Unified Editing (EMNLP 2023), a model-editing dataset where edits are natural-language phrases (☆23 · Updated Sep 4, 2024)
- Code and test data for "On Measuring Bias in Sentence Encoders" (NAACL 2019) (☆56 · Updated May 23, 2021)
- Embedding Recycling for language models (☆38 · Updated Jul 11, 2023)
- Implementation of THOR: Transformer with Stochastic Experts (☆65 · Updated Oct 7, 2021)