machelreid / diffuser
DiffusER: Discrete Diffusion via Edit-based Reconstruction (Reid, Hellendoorn & Neubig, 2022)
☆54 · Updated 2 years ago
Alternatives and similar repositories for diffuser:
Users interested in diffuser are comparing it to the libraries listed below.
- Semi-autoregressive Simplex-based Diffusion Language Model for Text Generation and Modular Control ☆69 · Updated 2 years ago
- ☆99 · Updated 2 years ago
- Language modeling via stochastic processes. Oral @ ICLR 2022. ☆138 · Updated last year
- ☆106 · Updated 2 years ago
- PyTorch reimplementation of REALM and ORQA ☆22 · Updated 3 years ago
- DEMix Layers for Modular Language Modeling ☆53 · Updated 3 years ago
- ☆59 · Updated 2 years ago
- [NeurIPS 2023] Repetition In Repetition Out: Towards Understanding Neural Text Degeneration from the Data Perspective ☆30 · Updated last year
- Code for the ICLR'22 paper "Improving Non-Autoregressive Translation Models Without Distillation" ☆18 · Updated 3 years ago
- Follow the Wisdom of the Crowd: Effective Text Generation via Minimum Bayes Risk Decoding ☆18 · Updated 2 years ago
- Reparameterized Discrete Diffusion Models for Text Generation ☆98 · Updated 2 years ago
- ☆75 · Updated last year
- [NAACL 2024] Text Diffusion Model with Encoder-Decoder Transformers for Sequence-to-Sequence Generation ☆94 · Updated last year
- The original Backpack Language Model implementation, a fork of FlashAttention ☆67 · Updated last year
- ☆128 · Updated 2 years ago
- [ICLR 2022] No Parameters Left Behind: Sensitivity Guided Adaptive Learning Rate for Training Large Transformer Models ☆30 · Updated 3 years ago
- ☆21 · Updated 2 years ago
- ☆25 · Updated last year
- Source code of LatentOps ☆78 · Updated last year
- An Empirical Study on Contrastive Search and Contrastive Decoding for Open-ended Text Generation ☆27 · Updated 10 months ago
- [NAACL 2022] MCSE: Multimodal Contrastive Learning of Sentence Embeddings ☆55 · Updated 10 months ago
- The official repository for "Parameter-Efficient Multi-task Tuning via Attentional Mixtures of Soft Prompts" (EMNLP 2022)