google-deepmind / transformer_grammars
Transformer Grammars: Augmenting Transformer Language Models with Syntactic Inductive Biases at Scale, TACL (2022)
☆127 · Updated last month
Alternatives and similar repositories for transformer_grammars
Users interested in transformer_grammars are comparing it to the repositories listed below.
- Minimum Bayes Risk Decoding for Hugging Face Transformers ☆58 · Updated last year
- ☆66 · Updated 2 years ago
- ☆67 · Updated 2 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data; it should work with any Hugging Face text dataset ☆93 · Updated 2 years ago
- The official code of LM-Debugger, an interactive tool for inspection of and intervention in transformer-based language models ☆178 · Updated 3 years ago
- ☆54 · Updated 2 years ago
- Utilities for the HuggingFace transformers library ☆70 · Updated 2 years ago
- Python library that enables complex compositions of language models, such as scratchpads, chain of thought, tool use, selection-inference… ☆208 · Updated 2 months ago
- ☆72 · Updated 2 years ago
- A library to create and manage configuration files, especially for machine learning projects ☆79 · Updated 3 years ago
- Helper scripts and notes used while porting various NLP models ☆45 · Updated 3 years ago
- Simple-to-use scoring function for arbitrarily tokenized texts ☆45 · Updated 5 months ago
- ☆51 · Updated 2 years ago
- A diff tool for language models ☆43 · Updated last year
- A Python library that encapsulates various methods for neuron interpretation and analysis in deep NLP models ☆102 · Updated last year
- Amos optimizer with the JEstimator lib ☆82 · Updated last year
- ☆166 · Updated 2 years ago
- A Toolkit for Distributional Control of Generative Models ☆73 · Updated last week
- ☆100 · Updated 2 years ago
- Pipeline for pulling and processing online language model pretraining data from the web ☆177 · Updated 2 years ago
- [NeurIPS 2023] Learning Transformer Programs ☆162 · Updated last year
- Code for the paper "The Impact of Positional Encoding on Length Generalization in Transformers" (NeurIPS 2023) ☆136 · Updated last year
- ☆48 · Updated last year
- Dataset collection and preprocessing framework for NLP extreme multitask learning ☆185 · Updated last month
- Official code and model checkpoints for the EMNLP 2022 paper "RankGen: Improving Text Generation with Large Ranking Models" (https://arx… ☆136 · Updated 2 years ago
- Implementation of the GBST block from the Charformer paper, in PyTorch ☆118 · Updated 4 years ago
- QAmeleon introduces synthetic multilingual QA data using PaLM, a 540B large language model. The dataset was generated by prompt tuning P… ☆34 · Updated last year
- Apps built using Inspired Cognition's Critique ☆58 · Updated 2 years ago
- An instruction-based benchmark for text improvements ☆141 · Updated 2 years ago
- 🤗 Transformers: State-of-the-art machine learning for PyTorch, TensorFlow, and JAX ☆82 · Updated 3 years ago