valdersoul / bn-vae
☆34 · Updated 5 years ago
Alternatives and similar repositories for bn-vae
Users interested in bn-vae are comparing it to the repositories listed below.
- Code for ACL 2020 "Jointly Masked Sequence-to-Sequence Model for Non-Autoregressive Neural Machine Translation" ☆39 · Updated 5 years ago
- Source Code for DialogWAE: Multimodal Response Generation with Conditional Wasserstein Autoencoder (https://arxiv.org/abs/1805.12352) ☆126 · Updated 7 years ago
- The official Keras implementation of ACL 2020 paper "Pre-train and Plug-in: Flexible Conditional Text Generation with Variational Auto-En… ☆48 · Updated 2 years ago
- Code for Structured Attention for Unsupervised Dialogue Structure Induction. ☆34 · Updated 2 years ago
- Generative Flow based Sequence-to-Sequence Toolkit written in Python. ☆246 · Updated 5 years ago
- This repo provides the code for the ACL 2020 paper "Evidence-Aware Inferential Text Generation with Vector Quantised Variational AutoEnco… ☆55 · Updated 4 years ago
- EMNLP-2020: Cross-lingual Spoken Language Understanding with Regularized Representation Alignment ☆18 · Updated 4 years ago
- This code repository presents the PyTorch implementation of the paper “Implicit Deep Latent Variable Models for Text Generation” (EMNLP 20… ☆55 · Updated 3 years ago
- Dispersed Exponential Family Mixture VAE ☆28 · Updated 5 years ago
- ☆22 · Updated 3 years ago
- ☆95 · Updated 7 years ago
- Transformer-Based Conditioned Variational Autoencoder for Story Completion ☆94 · Updated 5 years ago
- Code for the paper "Improving Sequence-to-Sequence Learning via Optimal Transport" ☆68 · Updated 6 years ago
- Implementation of the paper Tree Transformer ☆214 · Updated 5 years ago
- PyTorch Implementation of "A Hierarchical Latent Structure for Variational Conversation Modeling" (NAACL 2018 Oral) ☆173 · Updated last year
- ☆84 · Updated 5 years ago
- Implementation of Dual Learning NMT on PyTorch ☆163 · Updated 7 years ago
- PyTorch implementation of BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning (https://arxiv.org/ab…