s-nlp / annotated-transformer
http://nlp.seas.harvard.edu/2018/04/03/attention.html
☆62 · Updated 4 years ago
Alternatives and similar repositories for annotated-transformer
Users interested in annotated-transformer are comparing it to the libraries listed below.
- Visualising the Transformer encoder · ☆111 · Updated 4 years ago
- Distillation of a BERT model with the Catalyst framework · ☆78 · Updated 2 years ago
- A tiny Catalyst-like experiment runner framework on top of micrograd · ☆51 · Updated 4 years ago
- ☆103 · Updated 4 years ago
- A 🤗-style implementation of BERT using lambda layers instead of self-attention · ☆69 · Updated 4 years ago
- diagNNose, a Python library that provides a broad set of tools for analysing the hidden activations of neural models · ☆82 · Updated last year
- Create interactive textual heat maps for Jupyter notebooks · ☆196 · Updated last year
- PyTorch library for end-to-end transformer model training, inference and serving · ☆70 · Updated 2 months ago
- Docs · ☆144 · Updated 7 months ago
- A small library with distillation, quantization and pruning pipelines · ☆26 · Updated 4 years ago
- Code for the Shortformer model, from the ACL 2021 paper by Ofir Press, Noah A. Smith and Mike Lewis · ☆147 · Updated 3 years ago
- A collection of code snippets for my PyTorch Lightning projects · ☆108 · Updated 4 years ago
- LM Pretraining with PyTorch/TPU · ☆134 · Updated 5 years ago
- Theoretical Deep Learning: generalization ability · ☆46 · Updated 5 years ago
- (re)Implementation of Learning Multi-level Dependencies for Robust Word Recognition · ☆17 · Updated 11 months ago
- Code for fine-tuning a BERT classifier for multiclass text classification · ☆70 · Updated last month
- ☆64 · Updated 5 years ago
- HetSeq: Distributed GPU Training on Heterogeneous Infrastructure · ☆106 · Updated 2 years ago
- A deep learning library based on PyTorch, focused on low-resource language research and robustness · ☆70 · Updated 3 years ago
- Deep Learning for Natural Language Processing - Lectures 2023 · ☆174 · Updated 10 months ago
- XAI Tutorial for the Explainable AI track in the ALPS winter school 2021 · ☆58 · Updated 4 years ago
- fastai community entry to the 2020 Reproducibility Challenge · ☆17 · Updated 2 years ago
- A Python library with command-line tools for interacting with Dynabench (https://dynabench.org/), e.g. for uploading models · ☆55 · Updated 3 years ago
- Code for scaling Transformers · ☆26 · Updated 4 years ago
- ☆108 · Updated 2 years ago
- Learning to Initialize Neural Networks for Stable and Efficient Training · ☆139 · Updated 3 years ago
- Implementation of the Feedback Transformer in PyTorch · ☆107 · Updated 4 years ago
- Training Transformer-XL on 128 GPUs · ☆140 · Updated 5 years ago
- The "tl;dr" on a few notable transformer papers (pre-2022) · ☆190 · Updated 2 years ago
- Self-training with Weak Supervision (NAACL 2021) · ☆160 · Updated last year