sakuranew / attention-pytorch
A PyTorch implementation of the Q, K, V attention template proposed in "Attention Is All You Need", along with derived attention variants.
☆21Updated 5 years ago
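For context, a minimal sketch of the Q, K, V scaled dot-product attention template described above (an illustration only, not the repository's actual code) might look like this in PyTorch:

```python
import math
import torch
import torch.nn as nn

class ScaledDotProductAttention(nn.Module):
    """Minimal Q, K, V attention in the style of "Attention Is All You Need"."""

    def forward(self, q, k, v, mask=None):
        # q, k, v: (batch, seq_len, d_k); scores: (batch, seq_len, seq_len)
        d_k = q.size(-1)
        scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(d_k)
        if mask is not None:
            # Positions where mask == 0 are excluded from attention
            scores = scores.masked_fill(mask == 0, float("-inf"))
        attn = torch.softmax(scores, dim=-1)
        return torch.matmul(attn, v), attn
```

For example, `out, attn = ScaledDotProductAttention()(q, k, v)` with `q`, `k`, `v` of shape `(batch, seq_len, d_k)` returns the attended values and the attention weights.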
Alternatives and similar repositories for attention-pytorch
Users interested in attention-pytorch are comparing it to the libraries listed below
- Some basic deep learning models/methods for NLP text classification.☆79Updated 5 years ago
- Pytorch Implementation of Attention-Based BiLSTM for Relation Extraction ("Attention-Based Bidirectional Long Short-Term Memory Networks …☆45Updated last year
- ☆167Updated 6 years ago
- PyTorch implementation of BiLSTM with plain attention for Chinese multi-class text classification☆33Updated 4 years ago
- Text classification with a bidirectional LSTM + attention model☆90Updated 5 years ago
- Multi-head attention tested on the STS dataset, using PyTorch and torchtext. Concise code, well suited for newcomers learning how multi-head attention works; instead of a full Transformer with many layers, it uses multi-head attention + one linear layer☆18Updated 2 weeks ago
- A pytorch implementation of Fairseq Convolutional Sequence to Sequence Learning (Gehring et al., 2017)☆45Updated 6 years ago
- document classification using LSTM + self attention☆113Updated 5 years ago
- Code for TKDE paper "Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction"☆10Updated last year
- Examples related to deep learning☆40Updated 5 years ago
- star_transformer pytorch☆27Updated 5 years ago
- Text classification NLP pipeline with transformers☆26Updated last week
- The code for "Does Head Label Help for Long-Tailed Multi-Label Text Classification"☆29Updated 4 years ago
- Code for recognizing netizen sentiment during the COVID-19 epidemic, including LSTM, BERT, XLNet, and RoBERTa; best F1 of 0.725, deployed on Google Colab☆44Updated 5 years ago
- Sentiment classification of e-commerce reviews☆15Updated 5 years ago
- A quick walk-through of the innards of LSTMs and a naive implementation of the Mogrifier LSTM paper in PyTorch☆78Updated 5 years ago
- Pytorch implementation of Neural Machine Translation with seq2seq and attention (en-zh)☆41Updated 6 years ago
- Implementations of common NLP networks with result comparisons, covering the strengths and weaknesses of each model, e.g. FastText, TextCNN, TextRNN, TextRCNN, BiLSTM, Seq2seq, BERT, Transformer, ELMo, and attention mechanisms.☆46Updated 6 years ago
- ☆275Updated 3 years ago
- Official implementation of AAAI-21 paper "Label Confusion Learning to Enhance Text Classification Models"☆118Updated 2 years ago
- A pytorch implementation of Capsule Network.☆98Updated last year
- Hierarchical Attention Network (HAN) + multi-task learning for the AI Challenger fine-grained user review sentiment analysis task. https://challenger.ai/competition/fsauor2018☆58Updated 6 years ago
- Transformer/Transformer-XL/R-Transformer examples and explanations☆26Updated 3 years ago
- Exploration of BERT-BiLSTM models with Layer Aggregation (attention-based and capsule-routing-based) and Hidden-State Aggregation (attent…☆25Updated 5 years ago
- MixText: Linguistically-Informed Interpolation of Hidden Space for Semi-Supervised Text Classification☆24Updated 5 years ago
- Use Bert-CNN-Capsule for text classification☆10Updated 6 years ago
- (minimal implementation) BiLSTM-Attention for Relation Classification☆31Updated 8 months ago
- This is a repository for Multi-task learning with toy data in Pytorch and Tensorflow☆136Updated 6 years ago
- Named Entity Recognition (NER) with different combinations of BiGRU, Self-Attention and CRF☆63Updated 4 years ago
- Feature Projection for Improved Text Classification☆45Updated 5 years ago