HenryYuxuanWang / CharRNN
Automatic text generation project: Char-RNN
☆16 · Updated 6 years ago
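For orientation, the sketch below shows the basic idea behind a character-level RNN text generator: embed each character, run it through a recurrent layer, predict the next character, and at sampling time feed each prediction back in. This is a minimal illustration assuming PyTorch and a GRU cell, not the code of this repository.

```python
import torch
import torch.nn as nn

class CharRNN(nn.Module):
    """Minimal character-level RNN language model (illustrative sketch only)."""
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, hidden=None):
        # x: (batch, seq_len) of character indices
        emb = self.embed(x)
        output, hidden = self.rnn(emb, hidden)
        return self.out(output), hidden

def train_step(model, optimizer, batch):
    # batch: (batch, seq_len + 1); predict each character from the ones before it
    inputs, targets = batch[:, :-1], batch[:, 1:]
    logits, _ = model(inputs)
    loss = nn.functional.cross_entropy(
        logits.reshape(-1, logits.size(-1)), targets.reshape(-1)
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

@torch.no_grad()
def sample(model, start_idx, length, temperature=1.0):
    # Generate `length` character indices starting from a single seed index.
    model.eval()
    idx = torch.tensor([[start_idx]])
    hidden, generated = None, [start_idx]
    for _ in range(length):
        logits, hidden = model(idx, hidden)
        probs = torch.softmax(logits[:, -1] / temperature, dim=-1)
        idx = torch.multinomial(probs, num_samples=1)
        generated.append(idx.item())
    return generated
```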
Alternatives and similar repositories for CharRNN:
Users interested in CharRNN are comparing it to the libraries listed below
- A model for addressing the lack of diversity in generation tasks (translation, paraphrasing, etc.). ☆45 · Updated 5 years ago
- A collection of text classification methods, mainly covering TextCNN, TextRNN, LEAM, Transformer, Attention, fasttext, HAN, and more. ☆75 · Updated 6 years ago
- NLP Pretrained Embeddings, Models and Datasets Collection (NLP_PEMDC). The collection will keep updating. ☆64 · Updated 5 years ago
- Text similarity with BERT. ☆64 · Updated 5 years ago
- Code for 23rd place (team 水滴队) in the Byte Cup 2018 International Machine Learning Contest. ☆46 · Updated 6 years ago
- TensorFlow code and pre-trained models for BERT. ☆58 · Updated 3 years ago
- 5th-place code for the information extraction track of the 2019 Baidu Language and Intelligence Technology Competition. ☆69 · Updated 5 years ago
- Chinese word segmentation with biLSTM_CRF. ☆34 · Updated 6 years ago
- CCF-BDCI Big Data & Computing Intelligence Contest, Internet finance new entity discovery, 9th place. ☆54 · Updated 5 years ago
- Chinese sentence similarity computation based on the gensim module. ☆53 · Updated 6 years ago
- 2019 Language and Intelligence Technology Competition: proactive chat based on knowledge graphs. ☆115 · Updated 5 years ago
- siamese dssm sentence_similarity sentece_similarity_rank tensorflow ☆60 · Updated 6 years ago
- datagrand 2019 information extraction competition, rank 9. ☆130 · Updated 5 years ago
- Chinese text classification with BERT. ☆41 · Updated 5 years ago
- NLP-related tasks, including text classification, sequence labeling, text relations, machine translation, and more. ☆67 · Updated 5 years ago
- Named entity recognition with Transformer + CRF. ☆106 · Updated 6 years ago
- Fine-tuning BERT on Chinese corpora (Fine-tune Chinese for BERT). ☆81 · Updated 6 years ago
- Fine-tune pretrained language models (BERT, RoBERTa, XLBert, etc.) to compute the similarity between two texts, cast as a sentence-pair classification task; suited to Chinese text. ☆89 · Updated 4 years ago
- 4th-place solution for the 2018 JDDC competition. ☆237 · Updated 6 years ago
- Word and sentence embedding generation with BERT. ☆13 · Updated 5 years ago
- Uses the ALBERT pretrained model to recognize time expressions in text, and checks whether the model's inference time improves significantly. ☆56 · Updated 5 years ago
- Use ELMo in a Chinese environment. ☆104 · Updated 6 years ago
- BERT classification (classify, classifier). TensorFlow code and pre-trained models for BERT. ☆26 · Updated 5 years ago
- Named entity recognition with a TextCNN-BiLSTM-CRF model in PyTorch. ☆41 · Updated 6 years ago
- ☆91 · Updated 6 years ago
- 1st-place (champion) solution for user intent classification in the customer-service domain at CCL2018. ☆148 · Updated 2 years ago
- Relation Extraction: Chinese relation extraction. ☆72 · Updated 6 years ago
- Learning Chinese character embeddings from pretrained models; evaluates the Chinese performance of BERT and ELMo. ☆98 · Updated 5 years ago
- Automatic event extraction. ☆47 · Updated 6 years ago
- A seq2seq model with a BERT encoder and a Transformer decoder, usable for text generation tasks in NLP. ☆71 · Updated 5 years ago