ofirpress / sandwich_transformer
This repository contains the code for running the character-level Sandwich Transformers from our ACL 2020 paper on Improving Transformer Models by Reordering their Sublayers.
☆55 · Updated 4 years ago
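The paper behind this repository studies how the ordering of self-attention (s) and feed-forward (f) sublayers affects a transformer's perplexity, and proposes the "sandwich" pattern s^k (sf)^(n-k) f^k. The snippet below is not code from this repository, only a minimal sketch of that ordering idea; the function name `sandwich_order` is a hypothetical helper introduced here for illustration.

```python
# Minimal sketch (not this repository's code) of the "sandwich" sublayer
# ordering. A standard n-layer transformer interleaves self-attention (s) and
# feed-forward (f) sublayers as (sf)^n; a sandwich with coefficient k moves
# k attention sublayers to the bottom and k feed-forward sublayers to the top,
# giving s^k (sf)^(n-k) f^k while keeping the same number of each sublayer.

def sandwich_order(n: int, k: int) -> str:
    """Return the sublayer pattern s^k (sf)^(n-k) f^k as a string."""
    assert 0 <= k <= n, "sandwich coefficient k must satisfy 0 <= k <= n"
    return "s" * k + "sf" * (n - k) + "f" * k

if __name__ == "__main__":
    print(sandwich_order(16, 0))  # baseline interleaving: sfsf...sf
    print(sandwich_order(16, 6))  # sandwich: ssssss + sfsf...sf + ffffff
```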
Alternatives and similar repositories for sandwich_transformer:
Users interested in sandwich_transformer are comparing it to the libraries listed below.
- Factorization of the neural parameter space for zero-shot multi-lingual and multi-task transfer ☆39 · Updated 4 years ago
- The implementation of "Neural Machine Translation without Embeddings", NAACL 2021 ☆33 · Updated 3 years ago
- Code for bidirectional sequence generation (BiSon) for generating from BERT pre-trained models ☆51 · Updated 5 years ago
- Code for the EMNLP 2020 paper CoDIR ☆41 · Updated 2 years ago
- DisCo Transformer for non-autoregressive MT ☆77 · Updated 2 years ago
- Implementation of the retriever distillation procedure as outlined in the paper "Distilling Knowledge from Reader to Retriever" ☆32 · Updated 4 years ago
- Code for the paper "Latent Relation Language Models" at AAAI-20 ☆41 · Updated 4 years ago
- A PyTorch implementation of the Reformer network (https://openreview.net/pdf?id=rkgNKkHtvB) ☆53 · Updated 2 years ago
- A framework for training and evaluating AI models on a variety of openly available dialogue datasets ☆36 · Updated 4 years ago
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python ☆22 · Updated 2 years ago
- Implementation of MARGE, Pre-training via Paraphrasing, in PyTorch ☆75 · Updated 4 years ago
- Non-Monotonic Sequential Text Generation (ICML 2019)