uzaymacar / attention-mechanisms
Implementations of a family of attention mechanisms, suitable for a wide range of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
☆345 · Updated 9 months ago
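As background for the mechanisms these repositories implement, here is a minimal NumPy sketch of scaled dot-product attention, the building block behind the Transformer-style layers listed below. This is an illustrative sketch only; the function and variable names are assumptions, not the API of any repository in this list.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    # q: (..., Lq, d), k: (..., Lk, d), v: (..., Lk, dv)
    # Scores are query-key dot products scaled by sqrt(d)
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(q.shape[-1])
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ v, weights

rng = np.random.default_rng(0)
q = rng.standard_normal((2, 4, 8))  # batch of 2, 4 queries, dim 8
k = rng.standard_normal((2, 6, 8))  # 6 keys
v = rng.standard_normal((2, 6, 8))  # 6 values
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (2, 4, 8)
```

The additive (Bahdanau-style) and hierarchical variants that several repositories below implement differ mainly in how the scores are computed; the softmax-weighted sum over values is common to all of them.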
Related projects
Alternatives and complementary repositories for attention-mechanisms
- A wrapper layer for stacking layers horizontally ☆227 · Updated 2 years ago
- Attention mechanism for processing sequential data that considers the context for each timestamp. ☆656 · Updated 2 years ago
- Keras Layer implementation of Attention for Sequential models ☆444 · Updated last year
- This repository contains various types of attention mechanisms, such as Bahdanau attention, soft attention, additive attention, hierarchical attention… ☆122 · Updated 3 years ago
- A Keras+TensorFlow Implementation of the Transformer: Attention Is All You Need ☆708 · Updated 3 years ago
- Transformer implemented in Keras ☆370 · Updated 2 years ago
- Keras library for building (Universal) Transformers, facilitating BERT and GPT models ☆533 · Updated 4 years ago
- Attention-based LSTM/Dense implemented in Keras ☆294 · Updated 6 years ago
- An example attention network with a simple dataset. ☆230 · Updated 5 years ago
- PyTorch implementation of R-Transformer. Some parts of the code are adapted from the implementations of TCN and the Transformer. ☆224 · Updated 5 years ago
- All about attention in neural networks: soft attention, attention maps, local and global attention, and multi-head attention. ☆227 · Updated 4 years ago
- Attention-based bidirectional LSTM for classification tasks (ICASSP) ☆108 · Updated last year
- Implementation of the Transformer model in TensorFlow ☆455 · Updated last year
- Visualizing RNNs using the attention mechanism ☆747 · Updated 5 years ago
- A PyTorch implementation of Fairseq Convolutional Sequence to Sequence Learning (Gehring et al., 2017) ☆44 · Updated 5 years ago
- An LSTM in PyTorch with best practices (weight dropout, forget bias, etc.) built-in. Fully compatible with PyTorch LSTM. ☆133 · Updated 4 years ago
- TensorFlow implementation of focal loss ☆186 · Updated 3 years ago
- Self-attention and text classification ☆119 · Updated 6 years ago
- LSTM and GRU in PyTorch ☆251 · Updated 5 years ago
- A PyTorch implementation of the TCAN model in "Temporal Convolutional Attention-based Network For Sequence Modeling". ☆136 · Updated last year
- Multi-head attention for image classification ☆80 · Updated 6 years ago
- Word Embedding + LSTM + FC ☆159 · Updated 3 months ago
- Collection of custom layers and utility functions for Keras which are missing from the main framework. ☆62 · Updated 4 years ago
- TensorFlow implementation of 'Attention Is All You Need' (June 2017) ☆349 · Updated 6 years ago
- Minimal RNN classifier with self-attention in PyTorch ☆150 · Updated 2 years ago
- Implementation of seq2seq with attention in Keras ☆111 · Updated 4 years ago
- A repository containing tutorials for practical NLP using PyTorch ☆530 · Updated 5 years ago
- Document classification using LSTM + self-attention ☆112 · Updated 5 years ago
- A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction ☆111 · Updated 4 months ago