gonglinyuan / StackingBERT
Source code for "Efficient Training of BERT by Progressively Stacking"
☆113 · Jul 3, 2019 · Updated 6 years ago
Alternatives and similar repositories for StackingBERT
Users interested in StackingBERT are comparing it to the libraries listed below.
- Codes for "Understanding and Improving Transformer From a Multi-Particle Dynamic System Point of View" ☆147 · Jun 10, 2019 · Updated 6 years ago
- Official Pytorch Implementation of Length-Adaptive Transformer (ACL 2021) ☆102 · Nov 2, 2020 · Updated 5 years ago
- (CVPR 2022) Automated Progressive Learning for Efficient Training of Vision Transformers ☆25 · Feb 26, 2025 · Updated 11 months ago
- Repository for ACL 2019 paper ☆74 · Jun 30, 2019 · Updated 6 years ago
- ☆11 · Nov 13, 2020 · Updated 5 years ago
- MLPs for Vision and Language Modeling (Coming Soon) ☆27 · Dec 9, 2021 · Updated 4 years ago
- Tensorflow Source code for "Recurrently Controlled Recurrent Networks" (NIPS 2018) ☆23 · Oct 25, 2018 · Updated 7 years ago
- Transformer training code for sequential tasks ☆610 · Sep 14, 2021 · Updated 4 years ago
- Getting interpretable dimensions in word embedding spaces. ☆15 · Jul 6, 2023 · Updated 2 years ago
- A toolkit for evaluating the linguistic knowledge and transferability of contextual representations. Code for "Linguistic Knowledge and T… ☆210 · Oct 20, 2021 · Updated 4 years ago
- Single Headed Attention RNN - "Stop thinking with your head" ☆1,180 · Nov 27, 2021 · Updated 4 years ago
- Neural Text Generation with Unlikelihood Training ☆310 · Aug 31, 2021 · Updated 4 years ago
- This is the repo for the paper "Revealing the Importance of Semantic Retrieval for Machine Reading at Scale". ☆60 · Nov 25, 2019 · Updated 6 years ago
- Website for TextVQA dataset. ☆28 · Apr 30, 2023 · Updated 2 years ago
- NLP library designed for reproducible experimentation management ☆294 · Jul 25, 2024 · Updated last year
- PyTorch original implementation of Cross-lingual Language Model Pretraining. ☆2,924 · Feb 14, 2023 · Updated 2 years ago
- MASS: Masked Sequence to Sequence Pre-training for Language Generation ☆1,123 · Nov 28, 2022 · Updated 3 years ago
- Code for SegTree Transformer (ICLR-RLGM 2019). ☆27 · Nov 12, 2019 · Updated 6 years ago
- Multi-Task Deep Neural Networks for Natural Language Understanding ☆2,257 · Mar 7, 2024 · Updated last year
- A pytorch implementation of QAnet ☆28 · Dec 11, 2018 · Updated 7 years ago
- Code for the paper "Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks" ☆580 · Aug 28, 2019 · Updated 6 years ago
- Pytorch version of VidLanKD: Improving Language Understanding via Video-Distilled Knowledge Transfer (NeurIPS 2021) ☆56 · Feb 6, 2023 · Updated 3 years ago
- Code for the RecAdam paper: Recall and Learn: Fine-tuning Deep Pretrained Language Models with Less Forgetting. ☆120 · Nov 10, 2020 · Updated 5 years ago
- Understanding the Difficulty of Training Transformers ☆332 · May 31, 2022 · Updated 3 years ago
- Source code and dataset for ACL 2019 paper "ERNIE: Enhanced Language Representation with Informative Entities" ☆1,419 · Jan 10, 2024 · Updated 2 years ago
- ☆30 · Oct 2, 2018 · Updated 7 years ago
- Code for "Variational Neural Discourse Relation Recognizer" (EMNLP 2016) ☆16 · Dec 29, 2017 · Updated 8 years ago
- Resources for the MRQA 2019 Shared Task ☆294 · Aug 5, 2021 · Updated 4 years ago
- Neural Module Network for Reasoning over Text, ICLR 2020 ☆120 · Oct 6, 2020 · Updated 5 years ago
- Source code for "Accelerating Neural Transformer via an Average Attention Network" ☆78 · Jul 3, 2019 · Updated 6 years ago
- ☆99 · Jul 7, 2020 · Updated 5 years ago
- BISON: Binary Image SelectiON ☆49 · Sep 15, 2021 · Updated 4 years ago
- Do Neural Language Representations Learn Physical Commonsense? ☆22 · Dec 28, 2021 · Updated 4 years ago
- High performance pytorch modules ☆18 · Jan 14, 2023 · Updated 3 years ago
- LAnguage Model Analysis ☆1,392 · Jul 7, 2024 · Updated last year
- Language Model Pruning for Sequence Labeling ☆147 · Feb 29, 2020 · Updated 5 years ago
- ☆86 · Mar 11, 2020 · Updated 5 years ago
- Code for the Shortformer model, from the ACL 2021 paper by Ofir Press, Noah A. Smith and Mike Lewis. ☆147 · Jul 26, 2021 · Updated 4 years ago
- Auxiliary GAN for WE post-specialisation ☆24 · Feb 22, 2019 · Updated 6 years ago