mttk / rnn-classifier
Minimal RNN classifier with self-attention in PyTorch
☆150 · Updated 3 years ago
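To give a rough idea of what a minimal self-attentive RNN classifier looks like in PyTorch, here is an illustrative sketch; the model shape, layer names, and hyperparameters are assumptions for illustration and are not taken from the mttk/rnn-classifier code itself.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionRNNClassifier(nn.Module):
    """Illustrative RNN classifier with additive self-attention pooling
    over the encoder states (not the actual mttk/rnn-classifier code)."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)           # scores each time step
        self.out = nn.Linear(2 * hidden_dim, num_classes)  # classifies the pooled vector

    def forward(self, tokens):                              # tokens: (batch, seq_len) token ids
        states, _ = self.rnn(self.embedding(tokens))        # (batch, seq_len, 2*hidden)
        weights = F.softmax(self.attn(states), dim=1)       # (batch, seq_len, 1), sums to 1 over time
        pooled = (weights * states).sum(dim=1)              # attention-weighted sum of states
        return self.out(pooled)                             # (batch, num_classes) logits

# Example usage with made-up sizes:
# model = AttentionRNNClassifier(vocab_size=10_000, embed_dim=128, hidden_dim=256, num_classes=5)
# logits = model(torch.randint(0, 10_000, (4, 20)))
```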
Alternatives and similar repositories for rnn-classifier
Users interested in rnn-classifier are comparing it to the libraries listed below.
- PyTorch implementation of batched bi-RNN encoder and attention-decoder. ☆280 · Updated 6 years ago
- An LSTM in PyTorch with best practices (weight dropout, forget bias, etc.) built-in. Fully compatible with PyTorch LSTM. ☆134 · Updated 5 years ago
- Hierarchical Attention Networks for Document Classification in PyTorch ☆36 · Updated 6 years ago
- Code for "Strong Baselines for Neural Semi-supervised Learning under Domain Shift" (Ruder & Plank, 2018 ACL) ☆61 · Updated 2 years ago
- A Structured Self-attentive Sentence Embedding ☆494 · Updated 5 years ago
- PyTorch DataLoader for seq2seq ☆85 · Updated 6 years ago
- PyTorch implementation of R-Transformer. Some parts of the code are adapted from the implementation of TCN and Transformer. ☆230 · Updated 6 years ago
- Noise Contrastive Estimation for softmax output, written in PyTorch ☆319 · Updated 5 years ago
- Implementation of Universal Transformer in PyTorch ☆261 · Updated 6 years ago
- A complete PyTorch implementation of skip-gram ☆191 · Updated 7 years ago
- Two-Layer Hierarchical Softmax Implementation for PyTorch ☆69 · Updated 4 years ago
- Text convolution-deconvolution auto-encoder model in PyTorch ☆55 · Updated 7 years ago
- TensorFlow implementation of Variational Attention for Sequence to Sequence Models (COLING 2018) ☆71 · Updated 5 years ago
- Semi-Supervised Learning for Text Classification ☆83 · Updated 6 years ago
- Minimal Seq2Seq model with Attention for Neural Machine Translation in PyTorch ☆703 · Updated 4 years ago
- Word Embedding + LSTM + FC ☆161 · Updated last year
- DiSAN: Directional Self-Attention Network for RNN/CNN-Free Language Understanding ☆26 · Updated 7 years ago
- PyTorch implementations of LSTM Variants (Dropout + Layer Norm) ☆137 · Updated 4 years ago
- ☆209 · Updated last year
- Text classification with LSTM on the R8 dataset, implemented in PyTorch ☆141 · Updated 8 years ago
- This repository contains various types of attention mechanisms such as Bahdanau, Soft Attention, Additive Attention, Hierarchical Attention… ☆125 · Updated 3 years ago
- Bayesian Deep Active Learning for Natural Language Processing Tasks ☆147 · Updated 6 years ago
- Code for the paper "Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks" ☆580 · Updated 5 years ago
- PyTorch implementation of "Attention Is All You Need"