DeLighT: Very Deep and Light-Weight Transformers
☆469 · updated Oct 16, 2020
Alternatives and similar repositories for delight
Users interested in delight are comparing it to the libraries listed below.
- [ICLR 2020] Lite Transformer with Long-Short Range Attention (☆610, updated Jul 11, 2024)
- ☆221, updated Jun 8, 2020
- Code for the Shortformer model, from the ACL 2021 paper by Ofir Press, Noah A. Smith and Mike Lewis (☆147, updated Jul 26, 2021)
- Pytorch library for fast transformer implementations (☆1,763, updated Mar 23, 2023)
- ReConsider is a re-ranking model that re-ranks the top-K (passage, answer-span) predictions of an Open-Domain QA Model like DPR (Karpukhi… (☆49, updated Apr 26, 2021)
- Understanding the Difficulty of Training Transformers (☆332, updated May 31, 2022)
- FastFormers: highly efficient transformer models for NLU (☆709, updated Mar 21, 2025)
- Longformer: The Long-Document Transformer (☆2,188, updated Feb 8, 2023)
- SentAugment is a data augmentation technique for NLP that retrieves similar sentences from a large bank of sentences. It can be used in c… (☆359, updated Feb 22, 2022)
- Repository for the paper "Optimal Subarchitecture Extraction for BERT" (☆470, updated Jun 22, 2022)
- Transformer training code for sequential tasks (☆609, updated Sep 14, 2021)
- Official Pytorch Implementation of Length-Adaptive Transformer (ACL 2021) (☆102, updated Nov 2, 2020)
- Hopfield Networks is All You Need (☆1,900, updated Apr 23, 2023)
- An efficient implementation of the popular sequence models for text generation, summarization, and translation tasks. https://arxiv.org/p… (☆433, updated Aug 17, 2022)
- Transformer with Untied Positional Encoding (TUPE). Code of paper "Rethinking Positional Encoding in Language Pre-training". Improve exis… (☆253, updated Nov 8, 2021)
- [NAACL 2021] Factual Probing Is [MASK]: Learning vs. Learning to Recall https://arxiv.org/abs/2104.05240 (☆168, updated Oct 7, 2022)
- [ACL'20] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing (☆336, updated Jul 14, 2024)
- [NeurIPS 2021] "TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up", Yifan Jiang, Shiyu Chang, Zhangyang Wang (☆1,690, updated Nov 3, 2022)
- [ICML 2020] code for "PowerNorm: Rethinking Batch Normalization in Transformers" https://arxiv.org/abs/2003.07845 (☆120, updated Jun 20, 2021)
- Tracking the progress in non-autoregressive generation (translation, transcription, etc.) (☆302, updated Mar 15, 2023)
- This repository contains the code for "Exploiting Cloze Questions for Few-Shot Text Classification and Natural Language Inference" (☆1,628, updated Jun 12, 2023)
- Official DeiT repository (☆4,325, updated Mar 15, 2024)
- A masked language modeling objective to train a model to predict any subset of the target words, conditioned on both the input text and a… (☆246, updated Sep 17, 2021)
- Plug and Play Language Model implementation. Allows steering the topic and attributes of GPT-2 models (☆1,155, updated Feb 20, 2024)
- My take on a practical implementation of Linformer for Pytorch (☆421, updated Jul 27, 2022)
- Fast Block Sparse Matrices for Pytorch (☆549, updated Jan 21, 2021)
- The implementation of "Neural Machine Translation without Embeddings", NAACL 2021 (☆33, updated Jun 9, 2021)
- (ICCV 2021) BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search (☆142, updated Dec 6, 2021)
- This is the official repository for NAACL 2021, "XOR QA: Cross-lingual Open-Retrieval Question Answering".