ltgoslo / elc-bert
☆19 · Updated last week

Alternatives and similar repositories for elc-bert:
Users interested in elc-bert are comparing it to the libraries listed below.
- The evaluation pipeline for the 2024 BabyLM Challenge. ☆30 · Updated 5 months ago
- Minimum Bayes Risk Decoding for Hugging Face Transformers ☆57 · Updated 10 months ago
- Simple-to-use scoring function for arbitrarily tokenized texts. ☆39 · Updated 2 months ago
- Evaluation pipeline for the BabyLM Challenge 2023. ☆75 · Updated last year
- A Python library that encapsulates various methods for neuron interpretation and analysis in Deep NLP models. ☆102 · Updated last year
- LTG-Bert ☆32 · Updated last year
- Utility for behavioral and representational analyses of Language Models ☆138 · Updated this week
- [NAACL 2022] GlobEnc: Quantifying Global Token Attribution by Incorporating the Whole Encoder Layer in Transformers ☆21 · Updated last year
- Rust library for indexing and quickly searching large pretraining corpora ☆26 · Updated this week
- CausalGym: Benchmarking causal interpretability methods on linguistic tasks ☆41 · Updated 4 months ago
- Official Repository of Pretraining Without Attention (BiGS), the first model to achieve BERT-level transfer learning on the GLUE … ☆116 · Updated last year
- Transformer Grammars: Augmenting Transformer Language Models with Syntactic Inductive Biases at Scale, TACL (2022) ☆125 · Updated 6 months ago
- Utilities for the HuggingFace transformers library ☆67 · Updated 2 years ago
- Experiments in training a new and improved T5 ☆77 · Updated last year
- A Python package to run inference with HuggingFace language and vision-language checkpoints, wrapping many convenient features. ☆27 · Updated 7 months ago
- Collection of academic works in natural language processing, computational linguistics, and computational cognitive science that study th… ☆18 · Updated last year
- Official implementation of "GPT or BERT: why not both?" ☆52 · Updated last month
- How do transformer LMs encode relations? ☆47 · Updated last year
- Code for Zero-Shot Tokenizer Transfer ☆127 · Updated 3 months ago
- Materials for "Prompting is not a substitute for probability measurements in large language models" (EMNLP 2023) ☆23 · Updated last year
- Efficient Transformers with Dynamic Token Pooling ☆60 · Updated last year
- BLOOM+1: Adapting BLOOM model to support a new unseen language ☆71 · Updated last year