Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
☆378 · Feb 6, 2024 · Updated 2 years ago
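The headline repository provides attention layers for TensorFlow 2.0/Keras. As a rough illustration of what such layers compute, here is a minimal NumPy sketch of Bahdanau-style (additive) attention scoring; all names, shapes, and parameters below are illustrative and not taken from the repository itself:

```python
import numpy as np

def additive_attention(query, keys, W_q, W_k, v):
    """Bahdanau-style additive attention: score(q, k) = v^T tanh(W_q q + W_k k)."""
    # One scalar score per timestep; query broadcasts across all keys.
    scores = np.tanh(keys @ W_k.T + query @ W_q.T) @ v      # shape (T,)
    # Numerically stable softmax over timesteps.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: attention-weighted sum of the keys (used here as values).
    context = weights @ keys                                 # shape (d,)
    return context, weights

# Toy dimensions: d-dimensional states, T timesteps.
rng = np.random.default_rng(0)
d, T = 4, 3
query = rng.normal(size=(d,))
keys = rng.normal(size=(T, d))
W_q = rng.normal(size=(d, d))
W_k = rng.normal(size=(d, d))
v = rng.normal(size=(d,))

context, weights = additive_attention(query, keys, W_q, W_k, v)
```

In a real Keras model this computation would live inside a `Layer` subclass with trainable weights; the sketch only shows the score-softmax-weighted-sum pattern that the listed attention layers share.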
Alternatives and similar repositories for attention-mechanisms
Users interested in attention-mechanisms are comparing it to the libraries listed below.
- Keras Attention Layer (Luong and Bahdanau scores). ☆2,814 · Mar 12, 2026 · Updated last month
- Attention mechanism for processing sequential data that considers the context for each timestamp. ☆657 · Jan 22, 2022 · Updated 4 years ago
- Comparatively fine-tuning pretrained BERT models on downstream text classification tasks with different architectural configurations in … ☆126 · Jul 2, 2020 · Updated 5 years ago
- Keras Layer implementation of Attention for Sequential models ☆444 · Mar 25, 2023 · Updated 3 years ago
- This repository contains various types of attention mechanisms, such as Bahdanau, soft attention, additive attention, and hierarchical attention… ☆128 · Sep 23, 2021 · Updated 4 years ago
- ☆16 · May 23, 2018 · Updated 7 years ago
- State-of-the-art, faster Transformer with TensorFlow 2.0 (NLP, computer vision, audio). ☆85 · Mar 16, 2023 · Updated 3 years ago
- A wrapper layer for stacking layers horizontally ☆228 · Jan 22, 2022 · Updated 4 years ago
- ☆41 · Jun 5, 2022 · Updated 3 years ago
- Applying the Trading Deep Q-Network (TDQN) algorithm to shares in the hydrogen sector. ☆11 · Nov 11, 2020 · Updated 5 years ago
- ☆14 · Feb 26, 2019 · Updated 7 years ago
- MinScIE is an Open Information Extraction system that provides structured knowledge enriched with semantic information about citations. ☆15 · Jun 9, 2019 · Updated 6 years ago
- Visualizing RNNs using the attention mechanism ☆749 · Jun 25, 2019 · Updated 6 years ago
- Implementation of Hierarchical Attention Networks in PyTorch ☆129 · Oct 22, 2018 · Updated 7 years ago
- Temporal Pattern Attention for Multivariate Time Series Forecasting ☆733 · Nov 29, 2018 · Updated 7 years ago
- Extracting Entities with Limited Evidence ☆16 · Dec 26, 2022 · Updated 3 years ago
- Transformer implemented in Keras ☆368 · Jan 22, 2022 · Updated 4 years ago
- A list of resources about Text Style Transfer ☆59 · May 7, 2020 · Updated 5 years ago
- ☆11 · Jun 17, 2024 · Updated last year
- ☆31 · Jun 2, 2018 · Updated 7 years ago
- All about attention in neural networks: soft attention, attention maps, local and global attention, and multi-head attention. ☆235 · Jan 10, 2020 · Updated 6 years ago
- Official implementation of the AGSTN model (ICDM 2020) ☆12 · Sep 12, 2020 · Updated 5 years ago
- Learning from graph data using Keras ☆64 · May 9, 2019 · Updated 6 years ago
- Next-item recommendations in short sessions ☆10 · Sep 24, 2022 · Updated 3 years ago
- A Hyperparameter Tuning Library for Keras ☆2,925 · Dec 1, 2025 · Updated 4 months ago
- A Structured Self-attentive Sentence Embedding ☆493 · Sep 22, 2019 · Updated 6 years ago
- A Keras+TensorFlow Implementation of the Transformer: Attention Is All You Need ☆715 · Sep 24, 2021 · Updated 4 years ago
- ☆11 · Jun 13, 2017 · Updated 8 years ago
- A RAG that can scale 🧑🏻‍💻 ☆11 · May 28, 2024 · Updated last year
- ☆11 · Jul 13, 2021 · Updated 4 years ago
- Keras library for building (Universal) Transformers, facilitating BERT and GPT models ☆541 · May 30, 2020 · Updated 5 years ago
- Training word embeddings using hierarchical softmax with a semantic tree. ☆18 · Sep 8, 2017 · Updated 8 years ago
- Keras implementation of Attention Augmented Convolutional Neural Networks ☆120 · Mar 6, 2020 · Updated 6 years ago
- Dual-Staged Attention Model for Time Series prediction ☆65 · Nov 12, 2017 · Updated 8 years ago
- [SIGIR '20] How Useful are Reviews for Recommendation? A Critical Review and Potential Improvements ☆56 · May 23, 2022 · Updated 3 years ago
- A TensorFlow 2.0 implementation of TabNet. ☆245 · Apr 27, 2023 · Updated 2 years ago
- The official implementation of the paper "DIANet: Dense-and-Implicit-Attention-Network". ☆102 · Jun 16, 2023 · Updated 2 years ago
- ☆16 · May 23, 2020 · Updated 5 years ago
- PyTorch Tutorial (1.7) ☆457 · Dec 20, 2020 · Updated 5 years ago