hlamba28 / NMT-with-Attention-Mechanism
In this project I implement Neural Machine Translation with an attention mechanism. The code is written in Python with TensorFlow and uses tf.data.Dataset for the input pipeline, eager execution, and Keras model subclassing to define the model architecture.
☆24 · Updated 6 years ago
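The core idea behind such a model is an additive (Bahdanau-style) attention layer defined via Keras model subclassing and fed from a tf.data pipeline. Below is a minimal sketch of that approach; it is not the repository's actual code, and names such as `BahdanauAttention`, `make_dataset`, `src_ids`, and `tgt_ids` are illustrative placeholders.

```python
import tensorflow as tf

# Minimal sketch (not the repository's code): additive attention via Keras subclassing.
class BahdanauAttention(tf.keras.layers.Layer):
    def __init__(self, units):
        super().__init__()
        self.W1 = tf.keras.layers.Dense(units)  # projects encoder outputs
        self.W2 = tf.keras.layers.Dense(units)  # projects the decoder hidden state
        self.V = tf.keras.layers.Dense(1)       # scores each source position

    def call(self, query, values):
        # query: decoder state (batch, hidden); values: encoder outputs (batch, src_len, hidden)
        query = tf.expand_dims(query, 1)                               # (batch, 1, hidden)
        score = self.V(tf.nn.tanh(self.W1(values) + self.W2(query)))  # (batch, src_len, 1)
        weights = tf.nn.softmax(score, axis=1)                         # attention over source tokens
        context = tf.reduce_sum(weights * values, axis=1)              # (batch, hidden)
        return context, weights

# Hypothetical tf.data input pipeline over already-tokenized source/target id tensors.
def make_dataset(src_ids, tgt_ids, batch_size=64):
    return (tf.data.Dataset.from_tensor_slices((src_ids, tgt_ids))
            .shuffle(10_000)
            .batch(batch_size, drop_remainder=True)
            .prefetch(tf.data.AUTOTUNE))
```

In the usual decoding loop, the returned context vector is concatenated with the embedded previous target token before each decoder GRU/LSTM step, and the attention weights can be plotted to inspect source-target alignments.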
Alternatives and similar repositories for NMT-with-Attention-Mechanism
Users interested in NMT-with-Attention-Mechanism are comparing it to the repositories listed below.
- Transfer Learning for NLP Tasks ☆55 · Updated 6 years ago
- A very simple, bare-bones, inefficient implementation of skip-gram word2vec from scratch with Python ☆102 · Updated 6 years ago
- This is where I put all my work in Natural Language Processing ☆96 · Updated 4 years ago
- Implementation of abstractive summarization using LSTM in the encoder-decoder architecture with local attention. ☆167 · Updated 5 years ago
- Keras + Universal Sentence Encoder = Transfer Learning for text data ☆33 · Updated 7 years ago
- Implementation of text classification using character-level convolutional neural networks in Keras ☆149 · Updated 2 years ago
- ☆23 · Updated 7 years ago
- A bidirectional LSTM with attention for multiclass/multilabel text classification. ☆173 · Updated last year
- This is our team's solution report, which achieved top 10% (305/3307) in this competition. ☆59 · Updated 8 years ago
- LM, ULMFit et al. ☆46 · Updated 5 years ago
- ☆31 · Updated 7 years ago
- Kaggle: Quora Insincere Questions Classification - detect toxic content to improve online conversations ☆36 · Updated 6 years ago
- Sequence to Sequence and attention from scratch using TensorFlow ☆29 · Updated 8 years ago
- An example of how to train supervised classifiers for multi-label text classification using sklearn pipelines ☆110 · Updated 7 years ago
- TensorFlow implementation of recurrent neural networks (vanilla, LSTM, GRU) for text classification ☆118 · Updated 7 years ago
- Keras implementations of three language models: character-level RNN, word-level RNN and Sentence VAE (Bowman, Vilnis et al., 2016). ☆43 · Updated 4 years ago
- Collection of Notebooks for Natural Language Processing with PyTorch ☆31 · Updated 6 years ago
- Example showing generalisation ☆69 · Updated 5 years ago
- An Attention Layer in Keras ☆43 · Updated 6 years ago
- Translating English sentences to Marathi using Neural Machine Translation ☆38 · Updated 6 years ago
- Creating word embeddings from scratch and visualizing them on TensorBoard; using the trained embeddings in Keras. ☆34 · Updated 5 years ago
- Bidirectional LSTM + CRF (Conditional Random Fields) in TensorFlow ☆56 · Updated 7 years ago
- Text Generation Using A Variational Autoencoder ☆110 · Updated 8 years ago
- All my experiments with AI and ML ☆119 · Updated 7 years ago
- Character-level CNN for text classification ☆57 · Updated 3 years ago
- CNN for intent classification task in a Chatbot ☆102 · Updated 6 years ago
- For those who already have some basic idea about deep learning, and preferably are familiar with PyTorch. ☆47 · Updated 7 years ago
- Text Generation using Bidirectional LSTM and Doc2Vec models ☆62 · Updated 6 years ago
- Bidirectional Attention Flow for Machine Comprehension implemented in Keras 2 ☆64 · Updated 2 years ago
- Experiments with ELMo, a deep contextualized word representation, in Keras using TensorFlow Hub ☆14 · Updated 7 years ago