PAIR-code / pretraining-tda
☆13 · Updated 2 weeks ago
Alternatives and similar repositories for pretraining-tda:
Users who are interested in pretraining-tda are comparing it to the libraries listed below.
- AI Logging for Interpretability and Explainability 🔬 ☆105 · Updated 8 months ago
- A library for efficient patching and automatic circuit discovery. ☆54 · Updated 2 weeks ago
- Code for "Tracing Knowledge in Language Models Back to the Training Data" ☆37 · Updated 2 years ago
- ☆45 · Updated 6 months ago
- ☆46 · Updated last year
- ☆33 · Updated last year
- Codebase for the ICML submission "DOGE: Domain Reweighting with Generalization Estimation" ☆15 · Updated last year
- ☆58 · Updated this week
- Simple and scalable tools for data-driven pretraining data selection. ☆15 · Updated 2 weeks ago
- A fusion of a linear layer and a cross-entropy loss, written for PyTorch in Triton. ☆62 · Updated 7 months ago
- ☆34 · Updated 4 months ago
- ☆89 · Updated last year
- Code used for the experiments in the paper "Fine-Tuning Enhances Existing Mechanisms: A Case Study on Entity… ☆23 · Updated 11 months ago
- ☆36 · Updated last year
- ☆47 · Updated last year
- ☆21 · Updated 5 months ago
- Language models scale reliably with over-training and on downstream tasks ☆96 · Updated 11 months ago
- Open-source replication of Anthropic's Crosscoders for model diffing ☆40 · Updated 4 months ago
- A Kernel-Based View of Language Model Fine-Tuning (https://arxiv.org/abs/2210.05643) ☆74 · Updated last year
- Algebraic value editing in pretrained language models ☆62 · Updated last year
- Forcing Diffuse Distributions out of Language Models ☆14 · Updated 5 months ago
- ☆60 · Updated 3 years ago
- `dattri` is a PyTorch library for developing, benchmarking, and deploying efficient data attribution algorithms. ☆65 · Updated 3 weeks ago
- ☆79 · Updated 6 months ago
- ☆12 · Updated 8 months ago
- Skill-It! A Data-Driven Skills Framework for Understanding and Training Language Models ☆43 · Updated last year
- ☆26 · Updated 7 months ago
- Full code for the sparse probing paper. ☆53 · Updated last year