xuhaiming1996 / Densely-Connected-Bidirectional-LSTM
A DenseNet-style densely connected bidirectional LSTM (denseNet_bilstm).
☆9 · Updated 5 years ago
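The repository name describes the architecture: a stack of BiLSTM layers with DenseNet-style skip connections, where each layer receives the concatenation of the original input and the outputs of all previous layers. Below is a minimal NumPy sketch of that connectivity pattern only — the weights, dimensions, and helper names are hypothetical, and the actual repository's implementation (likely TensorFlow) may differ:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def lstm_scan(x, w, u, b, h_dim):
    """Single-direction LSTM over x of shape (T, d_in); returns (T, h_dim)."""
    T = x.shape[0]
    h = np.zeros(h_dim)
    c = np.zeros(h_dim)
    out = np.zeros((T, h_dim))
    for t in range(T):
        z = x[t] @ w + h @ u + b              # all four gates, shape (4 * h_dim,)
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)            # cell state update
        h = o * np.tanh(c)                    # hidden state
        out[t] = h
    return out

def bilstm(x, fw, bw, h_dim):
    """Bidirectional layer: concat forward pass and time-reversed backward pass."""
    fwd = lstm_scan(x, *fw, h_dim)
    bwd = lstm_scan(x[::-1], *bw, h_dim)[::-1]
    return np.concatenate([fwd, bwd], axis=-1)  # (T, 2 * h_dim)

def dense_bilstm(x, n_layers=3, h_dim=8, seed=0):
    """Densely connected stack: layer k sees [x, out_1, ..., out_{k-1}]."""
    rng = np.random.default_rng(seed)
    feats = [x]
    for _ in range(n_layers):
        inp = np.concatenate(feats, axis=-1)    # dense skip connections
        d_in = inp.shape[-1]
        def params():                            # random weights for illustration
            return (rng.standard_normal((d_in, 4 * h_dim)) * 0.1,
                    rng.standard_normal((h_dim, 4 * h_dim)) * 0.1,
                    np.zeros(4 * h_dim))
        feats.append(bilstm(inp, params(), params(), h_dim))
    # final representation: input plus every layer's output, concatenated
    return np.concatenate(feats, axis=-1)
```

For a sequence of shape `(T, d)` with `n_layers` layers of hidden size `h_dim`, the output width is `d + n_layers * 2 * h_dim`, since every layer's bidirectional output is retained in the final concatenation.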
Related projects
Alternatives and complementary repositories for Densely-Connected-Bidirectional-LSTM
- Chinese version of reformer-pytorch: a simple and efficient generative model with GPT-2-like results ☆16 · Updated last year
- An attempt to implement 'Dice Loss for Data-imbalanced NLP Tasks' ☆15 · Updated 3 years ago
- Slimmed-down NEZHA model weights ☆21 · Updated 3 years ago
- Global AI Technology Innovation Contest, Track 3: Xiaobu Assistant dialogue short-text semantic matching ☆11 · Updated 3 years ago
- ☆44 · Updated 2 years ago
- Relation Classification via Convolutional Deep Neural Network ☆13 · Updated 6 years ago
- Extractive Text Summarization Using Deep Learning: single-document summarization using RBM and fuzzy logic ☆9 · Updated 5 years ago
- Baseline for the CHIP 2019 Ping An Healthcare disease Q&A transfer-learning competition, rank 7 ☆26 · Updated 5 years ago
- A span-based joint named entity recognition (NER) and relation extraction model ☆10 · Updated 4 years ago
- Code for "A Novel Aspect-Guided Deep Transition Model for Aspect Based Sentiment Analysis" (EMNLP 2019) ☆21 · Updated 4 years ago
- Spanish short-text matching competition; 8/1027 in the preliminary round, 5/1027 in the final round ☆19 · Updated 6 years ago
- Data augmentation via back-translation; currently integrates Baidu, Youdao, and Google (requires a VPN) translation ☆20 · Updated 4 years ago
- A visualizer to display attention weights on text ☆23 · Updated 5 years ago
- papers ☆18 · Updated 7 years ago
- Multi-cell compositional LSTM for NER domain adaptation; code for an ACL 2020 paper ☆30 · Updated 3 years ago
- Second-place solution for the 2020 BAAI-JD Multimodal Dialogue Challenge ☆34 · Updated last year
- Ant Financial natural language processing competition ☆10 · Updated 6 years ago
- An easy-to-use toolkit for natural language processing tasks ☆26 · Updated 5 years ago
- Code for the AI Challenger 2018 reading-comprehension track ☆20 · Updated 5 years ago
- Multimodal BERT for the Chinese domain ☆46 · Updated 4 years ago
- ☆22 · Updated 2 years ago
- Uses BERT for multi-task learning, including ABSA (aspect-based sentiment analysis) and AE (aspect extraction) ☆10 · Updated 5 years ago
- Seq2seqAttGeneration, a basic implementation of text generation using a seq2seq attention model to generate poem series. This project… ☆17 · Updated 3 years ago
- Tianchi COVID-19 similar sentence-pair classification competition, team 大白, rank 6 ☆22 · Updated 4 years ago
- Implements an attention model for LSTM using TensorFlow ☆10 · Updated 6 years ago
- Leveraging Local and Global Patterns for Self-Attention Networks ☆12 · Updated 5 years ago
- BERT-based pretrained language model implementation in two steps, pretraining and fine-tuning; currently includes BERT, RoBERTa, and ALBERT, all supporting Whole Word Mask mode ☆16 · Updated 4 years ago
- Text classification examples ☆24 · Updated 2 years ago