elephantmipt / bert-distillation
Distillation of a BERT model with the Catalyst framework
☆78 · Updated 2 years ago
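For context on what this repository and most of the alternatives below implement: knowledge distillation trains a small student model to mimic a larger teacher, typically by mixing a soft-target loss on the teacher's logits with the usual hard-label loss. The sketch below is a minimal, generic PyTorch version of that loss; the function name, defaults, and toy usage are illustrative assumptions, not code taken from elephantmipt/bert-distillation or Catalyst.

```python
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    """Blend a soft-target KL loss (teacher -> student) with hard-label cross-entropy.

    NOTE: generic sketch of knowledge distillation; names and defaults are illustrative.
    """
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable (Hinton et al., 2015).
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: ordinary cross-entropy against the gold labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss


# Toy usage with random logits for a batch of 4 examples and 3 classes.
student = torch.randn(4, 3, requires_grad=True)
teacher = torch.randn(4, 3)
labels = torch.tensor([0, 2, 1, 0])
loss = distillation_loss(student, teacher, labels)
loss.backward()
```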
Alternatives and similar repositories for bert-distillation
Users interested in bert-distillation are comparing it to the libraries listed below.
- A small library with distillation, quantization and pruning pipelines ☆26 · Updated 4 years ago
- Load What You Need: Smaller Multilingual Transformers for Pytorch and TensorFlow 2.0. ☆105 · Updated 3 years ago
- LM Pretraining with PyTorch/TPU ☆136 · Updated 6 years ago
- Code for the Shortformer model, from the ACL 2021 paper by Ofir Press, Noah A. Smith and Mike Lewis. ☆147 · Updated 4 years ago
- On the Stability of Fine-tuning BERT: Misconceptions, Explanations, and Strong Baselines ☆137 · Updated 2 years ago
- 🛠️ Tools for Transformers compression using PyTorch Lightning ⚡ ☆85 · Updated this week
- Pytorch library for end-to-end transformer models training, inference and serving ☆70 · Updated 7 months ago
- ☆87 · Updated 3 years ago
- XtremeDistil framework for distilling/compressing massive multilingual neural network models to tiny and efficient models for AI at scale ☆157 · Updated last year
- Repository with illustrations for cft-contest-2018 ☆12 · Updated 7 years ago
- http://nlp.seas.harvard.edu/2018/04/03/attention.html ☆62 · Updated 4 years ago
- Language Modeling Example with Transformers and PyTorch Lightning ☆65 · Updated 5 years ago
- (re)Implementation of Learning Multi-level Dependencies for Robust Word Recognition ☆17 · Updated last year
- Dual Encoders for State-of-the-art Natural Language Processing. ☆61 · Updated 3 years ago
- ☆21 · Updated 4 years ago
- A lightweight but powerful library to build token indices for NLP tasks, compatible with major Deep Learning frameworks like PyTorch and … ☆51 · Updated last year
- A 🤗-style implementation of BERT using lambda layers instead of self-attention ☆69 · Updated 5 years ago
- Fine-tune transformers with pytorch-lightning ☆44 · Updated 3 years ago
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP ☆58 · Updated 3 years ago
- A deep learning library based on Pytorch focussed on low resource language research and robustness ☆70 · Updated 4 years ago
- ☆75 · Updated 4 years ago
- This repository contains the code for running the character-level Sandwich Transformers from our ACL 2020 paper on Improving Transformer … ☆55 · Updated 4 years ago
- A library to conduct ranking experiments with transformers. ☆160 · Updated 2 years ago
- Scalable Attentive Sentence-Pair Modeling via Distilled Sentence Embedding (AAAI 2020) - PyTorch Implementation ☆34 · Updated 2 years ago
- Accelerated NLP pipelines for fast inference on CPU. Built with Transformers and ONNX runtime. ☆127 · Updated 5 years ago
- Implementation of Mixout with PyTorch ☆75 · Updated 2 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any hugging face text dataset. ☆96 · Updated 2 years ago
- ML Reproducibility Challenge 2020: Electra reimplementation using PyTorch and Transformers ☆12 · Updated 4 years ago
- Helper scripts and notes that were used while porting various nlp models ☆48 · Updated 3 years ago
- Viewer for the 🤗 datasets library. ☆86 · Updated 4 years ago