hlamba28 / NMT-with-Attention-Mechanism
In this project I implement Neural Machine Translation with an attention mechanism. The code is written in Python using the TensorFlow library. I use TensorFlow features such as tf.data.Dataset to manage the input pipeline, eager execution, and model subclassing to build the model architecture.
☆24 · Updated 5 years ago
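Below is a minimal, illustrative sketch (not the repository's actual code) of the pieces mentioned above: a tf.data.Dataset input pipeline, a subclassed encoder, and a Bahdanau-style additive attention layer. The vocabulary size, layer dimensions, and toy integer data are placeholder assumptions.

```python
# Sketch of tf.data + model subclassing + additive attention (assumed shapes/sizes).
import tensorflow as tf

class BahdanauAttention(tf.keras.layers.Layer):
    def __init__(self, units):
        super().__init__()
        self.W1 = tf.keras.layers.Dense(units)  # projects encoder outputs
        self.W2 = tf.keras.layers.Dense(units)  # projects decoder query state
        self.V = tf.keras.layers.Dense(1)       # scores each source position

    def call(self, query, values):
        # query: (batch, hidden) decoder state; values: (batch, src_len, hidden)
        query_with_time_axis = tf.expand_dims(query, 1)
        score = self.V(tf.nn.tanh(self.W1(values) + self.W2(query_with_time_axis)))
        attention_weights = tf.nn.softmax(score, axis=1)        # (batch, src_len, 1)
        context_vector = tf.reduce_sum(attention_weights * values, axis=1)
        return context_vector, attention_weights

class Encoder(tf.keras.Model):
    def __init__(self, vocab_size, embedding_dim, units):
        super().__init__()
        self.embedding = tf.keras.layers.Embedding(vocab_size, embedding_dim)
        self.gru = tf.keras.layers.GRU(units, return_sequences=True, return_state=True)

    def call(self, x):
        return self.gru(self.embedding(x))  # (sequence outputs, final state)

# tf.data pipeline over placeholder integer-encoded sentence pairs.
src = tf.random.uniform((32, 10), maxval=1000, dtype=tf.int32)  # toy source batch
tgt = tf.random.uniform((32, 12), maxval=1000, dtype=tf.int32)  # toy target batch
dataset = tf.data.Dataset.from_tensor_slices((src, tgt)).shuffle(32).batch(8)

encoder = Encoder(vocab_size=1000, embedding_dim=64, units=128)
attention = BahdanauAttention(units=64)
for src_batch, tgt_batch in dataset.take(1):
    enc_outputs, enc_state = encoder(src_batch)
    context, weights = attention(enc_state, enc_outputs)
    print(context.shape, weights.shape)  # (8, 128), (8, 10, 1)
```

Subclassing keeps the attention computation explicit in `call`, which also makes it straightforward to return the attention weights for later visualization.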
Related projects
Alternatives and complementary repositories for NMT-with-Attention-Mechanism
- Sequence-to-sequence and attention from scratch using TensorFlow ☆29 · Updated 7 years ago
- This repository contains code showing how to use pre-trained word embeddings in TensorFlow ☆31 · Updated 6 years ago
- Keras and TensorFlow implementation of Siamese Recurrent Architectures for Learning Sentence Similarity ☆47 · Updated 5 years ago
- Attention-based sequence-to-sequence neural machine translation model built in Keras ☆30 · Updated 6 years ago
- Keras + Universal Sentence Encoder = Transfer Learning for text data ☆34 · Updated 6 years ago
- Translating English sentences to Marathi using Neural Machine Translation ☆37 · Updated 5 years ago
- Collection of deep learning text classification models in Keras; includes a GPU tutorial ☆14 · Updated 6 years ago
- Transfer Learning for NLP Tasks ☆56 · Updated 5 years ago
- Implementation of abstractive summarization using LSTMs in an encoder-decoder architecture with local attention ☆166 · Updated 4 years ago
- This is where I put all my work in Natural Language Processing ☆96 · Updated 3 years ago
- TensorFlow implementation of multi-task learning for language modeling and text classification ☆32 · Updated 2 years ago
- Collection of notebooks for Natural Language Processing with PyTorch ☆30 · Updated 5 years ago
- This repository contains the implementation of the paper "Hierarchical Attentional Hybrid Neural Networks for Document Classification" ☆59 · Updated 3 years ago
- Machine Translation using Transformers ☆30 · Updated 4 years ago
- Creating word embeddings from scratch, visualizing them on TensorBoard, and using the trained embeddings in Keras ☆28 · Updated 4 years ago
- LM, ULMFiT et al. ☆47 · Updated 4 years ago
- TensorFlow implementation of 'Ask Me Anything: Dynamic Memory Networks for Natural Language Processing' (2015) ☆42 · Updated 6 years ago
- A bidirectional encoder-decoder LSTM neural network trained for text summarization on the CNN/DailyMail dataset (MIT808 project) ☆81 · Updated 6 years ago
- An attention layer in Keras ☆43 · Updated 5 years ago
- Keras implementations of three language models: character-level RNN, word-level RNN, and Sentence VAE (Bowman, Vilnis et al., 2016) ☆42 · Updated 3 years ago
- Experiments with ELMo deep contextualized word representations in Keras using TensorFlow Hub ☆13 · Updated 6 years ago
- A bidirectional LSTM with attention for multiclass/multilabel text classification ☆173 · Updated 2 months ago
- Neural Machine Translation using a word-level seq2seq model and embeddings ☆37 · Updated 6 years ago
- TensorFlow implementation of Semi-supervised Sequence Learning (https://arxiv.org/abs/1511.01432) ☆82 · Updated 2 years ago
- Gendered Ambiguous Pronouns Shared Task ☆31 · Updated 2 years ago
- shabeelkandi / Handling-Out-of-Vocabulary-Words-in-Natural-Language-Processing-using-Language-Modelling ☆68 · Updated 5 years ago
- A complete Jupyter notebook implementing state-of-the-art Named Entity Recognition with bidirectional LSTMs and ELMo ☆63 · Updated 5 years ago