zaidalyafeai / AttentioNN
All about attention in neural networks. Soft attention, attention maps, local and global attention and multi-head attention.
☆234 · Updated 5 years ago
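The attention variants the repository covers (soft attention, attention maps, multi-head attention) all build on the same core operation. As a minimal sketch, here is scaled dot-product (soft) attention in NumPy; the array shapes and variable names are illustrative, not taken from the repository itself:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def soft_attention(Q, K, V):
    """Scaled dot-product attention: each query produces a weight
    distribution over the keys (the "attention map"), then returns
    the weighted sum of the values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (n_queries, n_keys) similarities
    weights = softmax(scores, axis=-1)   # rows sum to 1: the attention map
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 8))    # 2 queries of dimension 8
K = rng.normal(size=(5, 8))    # 5 keys of dimension 8
V = rng.normal(size=(5, 16))   # 5 values of dimension 16
out, attn = soft_attention(Q, K, V)
print(out.shape, attn.shape)   # (2, 16) (2, 5)
```

Multi-head attention runs several such operations in parallel on learned linear projections of Q, K, and V and concatenates the results.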
Alternatives and similar repositories for AttentioNN
Users interested in AttentioNN are comparing it to the libraries listed below.
- A simpler version of the self-attention layer from SAGAN, and some image classification results. ☆214 · Updated 6 years ago
- Keras implementation of Attention Augmented Convolutional Neural Networks. ☆121 · Updated 5 years ago
- Implements the ideas presented in https://arxiv.org/pdf/2004.11362v1.pdf by Khosla et al. ☆133 · Updated 5 years ago
- Multi-class classification with focal loss for imbalanced datasets. ☆82 · Updated 6 years ago
- Semi-Supervised Learning with Ladder Networks in Keras. Gets 98% test accuracy on MNIST with just 100 labeled examples! ☆101 · Updated 4 years ago
- Implementation of Rectified Adam in Keras. ☆70 · Updated 6 years ago
- Multi-head attention for image classification. ☆79 · Updated 7 years ago
- Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with Ten… ☆361 · Updated last year
- PyTorch notebook with a One Cycle Policy implementation (https://arxiv.org/abs/1803.09820)
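One entry above concerns focal loss for imbalanced multi-class classification. Focal loss down-weights well-classified examples by scaling cross-entropy with (1 - p_t)^γ, where p_t is the predicted probability of the true class. A minimal NumPy sketch (the function name, γ default, and example values are illustrative; γ = 2 follows the value commonly suggested in the focal loss paper by Lin et al., 2017):

```python
import numpy as np

def focal_loss(probs, labels, gamma=2.0, eps=1e-9):
    """Multi-class focal loss: mean of -(1 - p_t)^gamma * log(p_t).

    probs:  (n, c) predicted class probabilities (rows sum to 1)
    labels: (n,)   integer class ids
    gamma=0 recovers plain cross-entropy.
    """
    p_t = probs[np.arange(len(labels)), labels]   # probability of true class
    return float(np.mean(-((1.0 - p_t) ** gamma) * np.log(p_t + eps)))

probs = np.array([[0.9, 0.05, 0.05],   # easy example: high p_t, tiny loss term
                  [0.3, 0.6, 0.1]])    # hard example: low p_t dominates
labels = np.array([0, 2])
print(focal_loss(probs, labels))
```

Because (1 - p_t)^γ ≤ 1, the focal loss is never larger than cross-entropy on the same predictions; the gap is largest on examples the model already classifies confidently.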