hlamba28 / NMT-with-Attention-Mechanism
In this project I implement Neural Machine Translation using the attention mechanism. The code is written in Python using the TensorFlow library. I use TensorFlow features such as tf.data.Dataset to manage the input pipeline, and eager execution and model subclassing to build the model architecture.
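To illustrate the core of the attention mechanism the project uses, here is a minimal NumPy sketch of Bahdanau-style additive attention (scoring the encoder outputs against a decoder state, then forming a weighted context vector). This is an illustrative sketch, not the repo's actual TensorFlow code; all variable names and shapes here are assumptions.

```python
import numpy as np

def bahdanau_attention(query, values, W1, W2, v):
    """Additive (Bahdanau) attention, for illustration only.
    query:  (hidden,)           decoder hidden state at one step
    values: (seq_len, hidden)   encoder outputs for the source sentence
    W1, W2: (hidden, units)     learned projection matrices
    v:      (units,)            learned scoring vector
    Returns (context, weights).
    """
    # score[t] = v . tanh(W1 @ query + W2 @ h_t) for each encoder step t
    scores = np.tanh(query @ W1 + values @ W2) @ v      # (seq_len,)
    # softmax over source positions (numerically stabilized)
    e = np.exp(scores - scores.max())
    weights = e / e.sum()                               # (seq_len,)
    # context vector: attention-weighted sum of encoder outputs
    context = weights @ values                          # (hidden,)
    return context, weights

# Toy example with random (untrained) parameters
rng = np.random.default_rng(0)
hidden, units, seq_len = 8, 10, 5
q = rng.standard_normal(hidden)
h = rng.standard_normal((seq_len, hidden))
W1 = rng.standard_normal((hidden, units))
W2 = rng.standard_normal((hidden, units))
v = rng.standard_normal(units)
ctx, w = bahdanau_attention(q, h, W1, W2, v)
print(ctx.shape)  # context has the encoder hidden size
print(w.sum())    # attention weights sum to 1
```

In the repo's TensorFlow version, the projections would be Keras Dense layers inside a subclassed tf.keras.Model, and the softmax/weighted-sum would run on batched tensors; the arithmetic is the same.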
☆24 · Updated 6 years ago
Alternatives and similar repositories for NMT-with-Attention-Mechanism
Users interested in NMT-with-Attention-Mechanism are comparing it to the libraries listed below.
- This is where I put all my work in Natural Language Processing ☆96 · Updated 4 years ago
- Goal-Oriented Chatbot trained with Deep Reinforcement Learning ☆182 · Updated 6 years ago
- Transfer Learning for NLP Tasks ☆55 · Updated 6 years ago
- Implementation of abstractive summarization using LSTM in the encoder-decoder architecture with local attention. ☆166 · Updated 5 years ago
- State-of-the-art results in intent classification using Semantic Hashing for three datasets: AskUbuntu, Chatbot and WebApplication. ☆134 · Updated 5 years ago
- Text classification using character-level convolutional neural networks, implemented in Keras ☆150 · Updated 2 years ago
- Keras + Universal Sentence Encoder = Transfer Learning for text data ☆33 · Updated 6 years ago
- An example of how to train supervised classifiers for multi-label text classification using sklearn pipelines ☆110 · Updated 7 years ago
- Text Generation Using A Variational Autoencoder ☆110 · Updated 8 years ago
- ☆23 · Updated 7 years ago
- Sequence to Sequence and attention from scratch using TensorFlow ☆29 · Updated 7 years ago
- All my experiments with AI and ML ☆119 · Updated 7 years ago
- Machine Translation using Transformers ☆29 · Updated 5 years ago
- Keras implementations of three language models: character-level RNN, word-level RNN and Sentence VAE (Bowman, Vilnis et al. 2016). ☆43 · Updated 4 years ago
- Bidirectional Attention Flow for Machine Comprehension implemented in Keras 2 ☆64 · Updated 2 years ago
- NLP model implementations with Keras for beginners ☆152 · Updated 2 years ago
- This is our team's solution report, which achieves top 10% (305/3307) in this competition. ☆59 · Updated 8 years ago
- Sequence to Sequence Models in PyTorch ☆44 · Updated last year
- A very simple, bare-bones, inefficient implementation of skip-gram word2vec from scratch in Python ☆102 · Updated 6 years ago
- A bidirectional LSTM with attention for multiclass/multilabel text classification. ☆173 · Updated 11 months ago
- An Attention Layer in Keras ☆43 · Updated 6 years ago
- LM, ULMFiT et al. ☆46 · Updated 5 years ago
- Example showing generalisation ☆69 · Updated 4 years ago
- A simple technique to integrate BERT from TF Hub into Keras ☆258 · Updated 2 years ago
- Building a QA system for the Stanford Question Answering Dataset ☆247 · Updated 7 years ago
- CNN for the intent classification task in a chatbot ☆101 · Updated 5 years ago
- Neural Machine Translation using a word-level seq2seq model and embeddings ☆38 · Updated 7 years ago
- Collection of notebooks for Natural Language Processing with PyTorch ☆30 · Updated 6 years ago
- Code and supplementary materials for a series of Medium articles about the BERT model ☆77 · Updated 2 years ago
- ☆31 · Updated 7 years ago