u784799i / biLSTM_attn
☆166 · Updated 6 years ago
Alternatives and similar repositories for biLSTM_attn
Users interested in biLSTM_attn are comparing it to the repositories listed below.
- ☆275 · Updated 3 years ago
- Multi-label classification based on TextCNN and attention ☆77 · Updated 4 years ago
- ☆131 · Updated 4 years ago
- A summary of torchtext usage: implements the torchtext text-preprocessing pipeline step by step from scratch, including truncation/padding, vocabulary construction, loading pretrained word vectors, and building iterable datasets usable with PyTorch; also implements an LSTM in PyTorch. ☆175 · Updated 6 years ago
- Text classification with a bidirectional LSTM + attention algorithm ☆90 · Updated 5 years ago
- Some basic deep-learning models/methods for NLP text classification ☆80 · Updated 5 years ago
- PyTorch implementation of BiLSTM with plain attention for Chinese multi-class text classification ☆33 · Updated 4 years ago
- PyTorch BERT fine-tuning for Chinese text classification ☆219 · Updated last year
- A reproduction of the DPCNN (Deep Pyramid Convolutional Neural Networks for Text Categorization) paper for text classification, plus a modified reproduction based on the Inception model from the Zhihu Kanshan Cup, … ☆141 · Updated 6 years ago
- Transformer model in Keras ☆33 · Updated 6 years ago
- Hierarchical BiLSTM CNN using Keras ☆76 · Updated 7 years ago
- TextCNN built with PyTorch for Chinese text classification ☆129 · Updated 6 years ago
- ☆135 · Updated 6 years ago
- Hierarchical attention network (HAN) + multi-task learning for the AI Challenger fine-grained user-review sentiment analysis task. https://challenger.ai/competition/fsauor2018 ☆58 · Updated 6 years ago
- Attention-based LSTM/Dense implemented in Keras ☆299 · Updated 7 years ago
- Question classification with a multi-level attention mechanism, implemented in Keras ☆34 · Updated 6 years ago
- BERT-CNN-Capsule for text classification ☆10 · Updated 6 years ago
- LSTM + CNN with pretrained word vectors for text classification ☆103 · Updated 6 years ago
- word2vec implemented in PyTorch ☆149 · Updated 6 years ago
- An ensemble of various neural networks for sentiment classification, including CNN, LSTM, Transformer, and BERT models ☆72 · Updated 6 years ago
- Text classification based on Transformers ☆340 · Updated 3 years ago
- BERT for Chinese text classification ☆142 · Updated 6 years ago
- Self-attention and text classification ☆119 · Updated 6 years ago
- TensorFlow TextCNN and TextRNN for text classification ☆58 · Updated 6 years ago
- Capsule network, LSTM/GRU, and CNN applied to Chinese text classification, implemented in PyTorch ☆43 · Updated 6 years ago
- Capsule-network text-classification models built with Keras (including RNN, CNN, HAN, etc.); keras_utils contains the Keras implementations of the capsule and attention layers
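Most repositories above share one core pattern: a BiLSTM encoder whose timestep outputs are pooled by an attention layer before classification. The sketch below is my own minimal illustration of that pattern in PyTorch, not code taken from any listed project; all names and dimensions are hypothetical.

```python
# Minimal BiLSTM + additive-attention text classifier (illustrative sketch,
# not from any repository listed above).
import torch
import torch.nn as nn

class BiLSTMAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_classes=4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Score each timestep, then softmax the scores into attention weights.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):                       # x: (batch, seq_len) token ids
        h, _ = self.lstm(self.embedding(x))     # h: (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.attn(h).squeeze(-1), dim=1)  # (batch, seq_len)
        context = torch.bmm(weights.unsqueeze(1), h).squeeze(1)   # weighted sum
        return self.fc(context)                 # logits: (batch, num_classes)

model = BiLSTMAttention(vocab_size=5000)
logits = model(torch.randint(1, 5000, (8, 20)))  # batch of 8, sequences of 20
print(logits.shape)  # torch.Size([8, 4])
```

The hierarchical (HAN) variants in the list apply this same attention pooling twice, first over words within a sentence and then over sentences within a document.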