zhangfangdeng / SLCVAE
Self-labeling conditional variational autoencoder
☆19 · Updated 6 years ago
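For context, SLCVAE builds on the conditional variational autoencoder. Below is a minimal, hedged sketch of a plain conditional VAE in PyTorch; the layer sizes, conditioning by concatenation, and Gaussian-prior loss are generic assumptions, and the repository's self-labeling component is not reproduced here.

```python
# Minimal conditional VAE sketch (NOT the SLCVAE code itself).
# Layer sizes and concatenation-based conditioning are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConditionalVAE(nn.Module):
    def __init__(self, x_dim=784, c_dim=10, h_dim=256, z_dim=32):
        super().__init__()
        # Encoder q(z | x, c): condition by concatenating x with the label/context c.
        self.enc = nn.Sequential(nn.Linear(x_dim + c_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        # Decoder p(x | z, c): the latent code is also conditioned on c.
        self.dec = nn.Sequential(
            nn.Linear(z_dim + c_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, x_dim)
        )

    def forward(self, x, c):
        h = self.enc(torch.cat([x, c], dim=-1))
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        x_logits = self.dec(torch.cat([z, c], dim=-1))
        return x_logits, mu, logvar

def cvae_loss(x_logits, x, mu, logvar):
    # Reconstruction term plus KL divergence to the standard normal prior.
    recon = F.binary_cross_entropy_with_logits(x_logits, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl
```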
Alternatives and similar repositories for SLCVAE
Users interested in SLCVAE are comparing it to the repositories listed below
- Repository for the NLPCC2020 task AutoIE ☆52 · Updated 5 years ago
- 2019 Language and Intelligence Challenge, knowledge-driven dialogue track: source code and model for 5th place on leaderboard B ☆25 · Updated 5 years ago
- Code for the ACL 2019 paper on Data-to-text Generation with Entity Modeling ☆73 · Updated 3 years ago
- 2020 Language and Intelligence Challenge: recommendation-oriented dialogue task ☆52 · Updated 4 years ago
- lasertagger-chinese: a Chinese LaserTagger tutorial with example data, annotations, and shell scripts to run it ☆76 · Updated 2 years ago
- Chinese generative pre-trained model ☆99 · Updated 5 years ago
- 2019 Language and Intelligence Challenge, knowledge-driven dialogue track: source code and model for 5th place on leaderboard B ☆27 · Updated 6 years ago
- PyTorch version of the CLUE baselines ☆75 · Updated 5 years ago
- ☆25 · Updated 6 years ago
- Chinese UniLM pre-trained model ☆83 · Updated 4 years ago
- ☆76 · Updated 6 years ago
- Source code for the paper "Neural Architectures for Nested NER through Linearization" ☆91 · Updated 6 years ago
- Code for the NLPCC2017 paper "Large-scale Simple Question Generation by Template-based Seq2seq Learning" ☆84 · Updated 7 years ago
- DistilBERT for Chinese: large-scale pre-trained distilled Chinese BERT model ☆95 · Updated 5 years ago
- Chinese coreference resolution, an implementation of paper 1606.01323v2 (Stanford) ☆28 · Updated 7 years ago
- Code for ACL 2019: Entity-Relation Extraction as Multi-Turn Question Answering ☆75 · Updated 2 years ago
- Dataset and code for "Long and Diverse Text Generation with Planning-based Hierarchical Variational Model" (EMNLP 2019) ☆120 · Updated 5 years ago
- Dataset and baseline for SMP-MCC2020 ☆23 · Updated 2 years ago
- ☆23 · Updated 4 years ago
- ALBERT Large QA model trained on the Baidu WebQA and DuReader datasets ☆77 · Updated 5 years ago
- Code for our paper at EMNLP 2019 ☆36 · Updated 5 years ago
- 24×2 pre-trained small BERT models, a handy toolkit for NLP practitioners ☆51 · Updated 5 years ago
- Chinese text editing built on the LaserTagger model, named "文本手术刀" ("Text Scalpel"); the project currently implements a text paraphrasing task for data augmentation of NLP corpora ☆214 · Updated 2 years ago
- ☆90 · Updated 5 years ago
- RoBERTa-wwm base model distilled from RoBERTa-wwm-large ☆66 · Updated 5 years ago
- ☆61 · Updated 5 years ago
- A seq2seq model with a BERT encoder and a Transformer decoder, usable for text generation tasks in NLP ☆75 · Updated 6 years ago
- 2019 Language and Intelligence Challenge: knowledge-graph-based proactive conversation ☆115 · Updated 6 years ago
- ☆78 · Updated 6 years ago
- ☆29 · Updated 6 years ago