YilunZhou / ExSum
Code repository for the NAACL 2022 paper "ExSum: From Local Explanations to Model Understanding"
☆64 · Updated 3 years ago
Alternatives and similar repositories for ExSum
Users interested in ExSum are comparing it to the libraries listed below.
- Interpretable and efficient predictors using pre-trained language models. Scikit-learn compatible. ☆44 · Updated this week
- Lightning template for easy prototyping ⚡️ ☆13 · Updated 3 years ago
- ☆18 · Updated 3 years ago
- ☆139 · Updated 2 years ago
- Embedding Recycling for Language Models ☆38 · Updated 2 years ago
- 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. ☆81 · Updated 3 years ago
- Weakly Supervised End-to-End Learning (NeurIPS 2021) ☆157 · Updated 2 years ago
- Interactive Weak Supervision: Learning Useful Heuristics for Data Labeling ☆31 · Updated 4 years ago
- Ranking of fine-tuned HF models as base models. ☆36 · Updated last month
- Framework for zero-shot learning with knowledge graphs. ☆113 · Updated 2 years ago
- A Python package for benchmarking interpretability techniques on Transformers. ☆212 · Updated last year
- Ensembling Hugging Face transformers made easy ☆62 · Updated 2 years ago
- Experiments on GPT-3's ability to fit numerical models in-context. ☆14 · Updated 3 years ago
- A diff tool for language models ☆44 · Updated last year
- ☆13 · Updated 2 years ago
- Library for creating causal chains using language models. ☆81 · Updated 2 years ago
- ☆21 · Updated 4 years ago
- A Python library with command-line tools for interacting with Dynabench (https://dynabench.org/), such as uploading models. ☆55 · Updated 3 years ago
- Code associated with the paper "Entropy-based Attention Regularization Frees Unintended Bias Mitigation from Lists" ☆50 · Updated 3 years ago
- TimeLMs: Diachronic Language Models from Twitter ☆111 · Updated last year
- Self-training with Weak Supervision (NAACL 2021) ☆162 · Updated 2 years ago
- ☆55 · Updated 2 years ago
- RATransformers 🐭 – Make your transformer (like BERT, RoBERTa, GPT-2, and T5) relation-aware! ☆42 · Updated 2 years ago
- Measuring if attention is explanation with ROAR ☆22 · Updated 2 years ago
- ☆40 · Updated 2 years ago
- [NAACL 2021] Code for our paper "Fine-Tuning Pre-trained Language Model with Weak Supervision: A Contrastive-Regularized Self…" ☆205 · Updated 3 years ago
- Google's BigBird (Jax/Flax & PyTorch) @ 🤗 Transformers ☆49 · Updated 2 years ago
- NLP examples using the 🤗 libraries ☆40 · Updated 4 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data; should work with any Hugging Face text dataset. ☆96 · Updated 2 years ago
- Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging" ☆68 · Updated 3 years ago