galsang / trees_from_transformers
Official code for the ICLR 2020 paper 'Are Pre-trained Language Models Aware of Phrases? Simple but Strong Baselines for Grammar Induction'.
☆30 · Updated last year
Alternatives and similar repositories for trees_from_transformers
Users who are interested in trees_from_transformers are comparing it to the libraries listed below.
- Code for "Understanding Neural Abstractive Summarization Models via Uncertainty" (EMNLP 2020) · ☆30 · Updated 4 years ago
- Implementation of ICLR 2020 paper "Revisiting Self-Training for Neural Sequence Generation" · ☆46 · Updated 2 years ago
- ENGINE: Energy-Based Inference Networks for Non-Autoregressive Machine Translation · ☆25 · Updated 4 years ago
- ☆42 · Updated 4 years ago
- ☆45 · Updated 3 years ago
- Explicit Alignment Objectives for Multilingual Bidirectional Encoders · ☆14 · Updated 4 years ago
- ☆41 · Updated 4 years ago
- ReConsider is a re-ranking model that re-ranks the top-K (passage, answer-span) predictions of an open-domain QA model like DPR (Karpukhi… · ☆49 · Updated 4 years ago
- PyTorch implementation of "A Surprisingly Effective Fix for Deep Latent Variable Modeling of Text" (EMNLP 2019) · ☆48 · Updated 5 years ago
- Implementation of the paper "Specializing pretrained word embeddings (for parsing) by IB" · ☆52 · Updated 2 years ago
- ☆31 · Updated 5 years ago
- The Return of Lexical Dependencies: Neural Lexicalized PCFGs (TACL) · ☆33 · Updated 3 years ago
- Differentiable Perturb-and-Parse operator · ☆25 · Updated 6 years ago
- The implementation of "Neural Machine Translation without Embeddings" (NAACL 2021) · ☆33 · Updated 3 years ago
- Posterior Control of Blackbox Generation