SamLynnEvans / LSTM_with_attention
Seq2seq using LSTM with attention from Luong et al
☆10Updated 6 years ago
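The repository implements seq2seq attention in the style of Luong et al. (2015). As a rough reference for what that mechanism computes, here is a minimal pure-Python sketch of Luong "global" attention with the dot-product score: the decoder state is scored against each encoder state, the scores are softmax-normalized, and the context vector is the weighted sum of encoder states. The function name and the use of plain lists instead of tensors are illustrative assumptions, not the repository's actual API.

```python
import math

def luong_dot_attention(decoder_state, encoder_states):
    """Sketch of Luong-style global attention with the 'dot' score.

    decoder_state:  list[float], current decoder hidden state h_t
    encoder_states: list[list[float]], encoder hidden states h_s
    Returns (context_vector, attention_weights).
    """
    # score(h_t, h_s) = h_t . h_s  (the 'dot' variant from Luong et al.)
    scores = [sum(a * b for a, b in zip(decoder_state, h_s))
              for h_s in encoder_states]

    # softmax over source positions (shift by max for numerical stability)
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]

    # context vector: attention-weighted sum of encoder states
    dim = len(decoder_state)
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(dim)]
    return context, weights
```

In the full model, the context vector is concatenated with the decoder state and passed through a learned layer to produce the attentional hidden state used for prediction; a real implementation would do the above with batched matrix multiplications.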
Alternatives and similar repositories for LSTM_with_attention
Users interested in LSTM_with_attention are comparing it to the repositories listed below
- My 1st place solution at WSDM 2019 cup for fake news classification☆44Updated 5 years ago
- GPU-accelerated PyTorch implementation of Zero-shot User Intent Detection via Capsule Neural Networks☆16Updated 6 years ago
- Position embedding layers in Keras☆58Updated 3 years ago
- Knowledge Distillation For Transformer Language Models☆52Updated last year
- Extracts contextual embeddings from sentences☆26Updated 7 years ago
- Bidirectional Attention Flow for Machine Comprehension implemented in Keras 2☆64Updated 2 years ago
- Sequence to Sequence Models in PyTorch☆44Updated last year
- Comparing Text Classification results using BERT embedding and ULMFIT embedding☆65Updated 6 years ago
- Using BERT For Classifying Documents with Long Texts, check my latest post: https://armandolivares.tech/☆41Updated 5 years ago
- Tensorflow Implementation of Densely Connected Bidirectional LSTM with Applications to Sentence Classification☆47Updated 7 years ago
- Implementation of paper "Learning to Encode Text as Human-Readable Summaries using GAN"☆66Updated 5 years ago
- A Neural Attention Model for Abstractive Sentence Summarization in DyNet☆19Updated 7 years ago
- Reproducing Character-Level-Language-Modeling with Deeper Self-Attention in PyTorch☆61Updated 6 years ago
- Scripts to train a bidirectional LSTM with knowledge distillation from BERT☆158Updated 5 years ago
- QnA bot powered by CoQA + BERT☆39Updated 2 years ago
- Multilingual Neural Machine Translation using Transformers with Conditional Normalization.☆18Updated 2 years ago
- Tensorflow implementation of Chinese word segmentation using LSTM+CRF and dilated CNN+CRF☆15Updated 7 years ago
- BERT Extension in TensorFlow☆30Updated 5 years ago
- Semi Supervised Learning for Text-Classification☆83Updated 6 years ago
- Uses the XLNet language model for sequence tagging / sequence labelling / named entity recognition (NER) / noun extraction☆18Updated 5 years ago
- LM, ULMFit et al.☆46Updated 5 years ago
- ☆53Updated 4 years ago
- Format converter between different types of QA task datasets☆39Updated 6 years ago
- BERT for joint intent classification and slot filling☆39Updated 5 years ago
- Transformer-XL with checkpoint loader☆68Updated 3 years ago
- Seq2seq attention in Keras☆39Updated 6 years ago
- CapsNet for NLP☆67Updated 6 years ago
- keras attentional bi-LSTM-CRF for Joint NLU (slot-filling and intent detection) with ATIS☆124Updated 6 years ago
- Visualize BERT's self-attention layers on text classification tasks☆47Updated 6 years ago
- ☆32Updated 6 years ago