zbambergerNLP / principled-pre-training
A repository to get acquainted with basic training tasks in natural language processing and machine learning
☆12 · Updated last year
Alternatives and similar repositories for principled-pre-training
Users interested in principled-pre-training are comparing it to the repositories listed below.
- Code associated with the paper "Entropy-based Attention Regularization Frees Unintended Bias Mitigation from Lists" · ☆50 · Updated 3 years ago
- ☆55 · Updated 2 years ago
- Repository collecting resources and best practices to improve experimental rigour in deep learning research · ☆27 · Updated 2 years ago
- Utilities for the HuggingFace transformers library · ☆72 · Updated 2 years ago
- This repository accompanies our paper "Do Prompt-Based Models Really Understand the Meaning of Their Prompts?" · ☆85 · Updated 3 years ago
- diagNNose is a Python library that provides a broad set of tools for analysing hidden activations of neural models · ☆82 · Updated 2 years ago
- Measuring if attention is explanation with ROAR · ☆22 · Updated 2 years ago
- ☆75 · Updated 4 years ago
- A library to create and manage configuration files, especially for machine learning projects · ☆80 · Updated 3 years ago
- KnowMAN: Weakly Supervised Multinomial Adversarial Networks · ☆12 · Updated 4 years ago
- RATransformers 🐭: Make your transformer (like BERT, RoBERTa, GPT-2 and T5) Relation Aware! · ☆42 · Updated 2 years ago
- The official code of LM-Debugger, an interactive tool for inspection and intervention in transformer-based language models · ☆180 · Updated 3 years ago
- 🌾 Universal, customizable and deployable fine-grained evaluation for text generation · ☆24 · Updated 2 years ago
- Simple and scalable tools for data-driven pretraining data selection · ☆29 · Updated 5 months ago
- A Python library that encapsulates various methods for neuron interpretation and analysis in Deep NLP models · ☆105 · Updated 2 years ago
- Code for the preprint "Summarizing Differences between Text Distributions with Natural Language" · ☆43 · Updated 2 years ago
- Query-focused summarization data · ☆42 · Updated 2 years ago
- State-of-the-art paired encoder and decoder models (17M-1B params) · ☆53 · Updated 3 months ago
- Code base for the EMNLP 2021 Findings paper "Cartography Active Learning" · ☆14 · Updated 5 months ago
- Mechanistic Interpretability for Transformer Models · ☆53 · Updated 3 years ago
- Ranking of fine-tuned HF models as base models · ☆36 · Updated 2 months ago
- A diff tool for language models · ☆44 · Updated last year
- ☆139 · Updated 2 years ago
- ☆37 · Updated last month
- ☆54 · Updated 2 years ago
- Official codebase accompanying our ACL 2022 paper "RELiC: Retrieving Evidence for Literary Claims" (https://relic.cs.umass.edu) · ☆20 · Updated 3 years ago
- M2D2: A Massively Multi-domain Language Modeling Dataset (EMNLP 2022) by Machel Reid, Victor Zhong, Suchin Gururangan, Luke Zettlemoyer · ☆54 · Updated 3 years ago
- Code for the paper "Do Language Models Have Beliefs? Methods for Detecting, Updating, and Visualizing Model Beliefs" · ☆28 · Updated 3 years ago
- Rationales for Sequential Predictions · ☆40 · Updated 3 years ago
- PAIR.withgoogle.com and friends' work on interpretability methods · ☆214 · Updated this week