greentfrapp / attention-primer
A demonstration of the attention mechanism with some toy experiments and explanations.
☆108 · Updated 6 years ago
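The toy experiments in this repo build on scaled dot-product attention. As a quick orientation (a minimal NumPy sketch, not code from the repo itself; the function name and toy shapes are illustrative):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted sum of value vectors

# Toy check: 2 queries attending over 3 key/value pairs.
rng = np.random.default_rng(0)
out = scaled_dot_product_attention(rng.normal(size=(2, 4)),
                                   rng.normal(size=(3, 4)),
                                   rng.normal(size=(3, 5)))
print(out.shape)  # (2, 5)
```

Each output row is a convex combination of the value vectors, weighted by the softmax over that query's similarity to every key.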
Alternatives and similar repositories for attention-primer
Users interested in attention-primer are comparing it to the libraries listed below.
- Training Transformer-XL on 128 GPUs (☆140, updated 5 years ago)
- Configure Python functions explicitly and safely (☆126, updated 7 months ago)
- Learning to search in PyTorch (☆110, updated 5 years ago)
- Experiment orchestration (☆103, updated 5 years ago)
- The Annotated Encoder-Decoder with Attention (☆166, updated 4 years ago)
- Pip-installable differentiable stacks in PyTorch! (☆65, updated 4 years ago)
- Visualising the Transformer encoder (☆111, updated 4 years ago)
- LM Pretraining with PyTorch/TPU (☆134, updated 5 years ago)
- Probabilistic classification in PyTorch/TensorFlow/scikit-learn with Fenchel-Young losses (☆186, updated last year)
- A library for evaluating representations (☆76, updated 3 years ago)
- PyTorch functions and utilities to make your life easier (☆195, updated 4 years ago)
- Mixture Density Networks (Bishop, 1994) tutorial in JAX (☆59, updated 5 years ago)
- Python implementation of GLN in different frameworks (☆98, updated 4 years ago)
- Pre-training of Language Models for Language Understanding (☆83, updated 5 years ago)
- Generic reinforcement learning codebase in TensorFlow (☆95, updated 3 years ago)
- Framework-agnostic library for checking array/tensor shapes at runtime (☆46, updated 4 years ago)
- Plot TensorBoard graphs fast (☆51, updated 3 years ago)
- PyTorch Lightning seed project with Hydra (☆18, updated 4 years ago)
- ICLR Reproducibility Challenge 2019 (☆218, updated 6 years ago)
- Docs (☆144, updated 7 months ago)
- Explorations in building seq2seq models using PyTorch and fast.ai (☆14, updated 5 years ago)
- Code for the Shortformer model, from the ACL 2021 paper by Ofir Press, Noah A. Smith, and Mike Lewis (☆147, updated 3 years ago)