DengYangyong / word2vec
Train Chinese word2vec word vectors with gensim and TensorFlow
☆11 · Updated 6 years ago
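For orientation, a minimal gensim sketch of the kind of training the repository describes (not the repository's own script; the corpus path and hyperparameters are placeholders, and gensim ≥ 4.0 is assumed for the `vector_size` argument, which is named `size` on 3.x):

```python
# Minimal sketch: train Chinese word vectors with gensim.
# Assumes a hypothetical pre-segmented corpus at corpus_seg.txt
# (one whitespace-tokenized sentence per line).
from gensim.models import Word2Vec
from gensim.models.word2vec import LineSentence

sentences = LineSentence("corpus_seg.txt")      # stream sentences from disk
model = Word2Vec(
    sentences,
    vector_size=200,   # embedding dimensionality
    window=5,          # context window size
    min_count=5,       # ignore rare words
    sg=1,              # 1 = skip-gram, 0 = CBOW
    workers=4,
)
model.save("word2vec_zh.model")
print(model.wv.most_similar("中国", topn=5))     # nearest neighbours of a word
```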
Alternatives and similar repositories for word2vec
Users that are interested in word2vec are comparing it to the libraries listed below
- Chinese relation extraction implemented with BERT ☆17 · Updated 2 years ago
- Papers related to event knowledge graph construction, covering event extraction, event relation recognition, and similar tasks ☆82 · Updated 2 years ago
- Baseline for joint extraction of Chinese entities and relations ☆12 · Updated 6 years ago
- ☆34 · Updated 4 years ago
- Chinese text preprocessing, with Word2Vec training for text similarity computation ☆45 · Updated 6 years ago
- PyTorch implementation of multi-label text classification, including several models and pretrained weights, with particular attention to Chinese preprocessing ☆75 · Updated 5 years ago
- Named entity recognition with iterated dilated convolutions ☆45 · Updated 5 years ago
- Person relation classification with the R-BERT model, giving a significant improvement in results ☆24 · Updated 2 years ago
- Multi-label text classification ☆54 · Updated 6 years ago
- Multi-label text classification with Keras and Keras-bert, fine-tuning BERT ☆67 · Updated 4 years ago
- bilstm_Attention_crf ☆37 · Updated 6 years ago
- Code for ACL 2020, "FLAT: Chinese NER Using Flat-Lattice Transformer"; parts of the source have been annotated, modified, and extended to make the code easier to reproduce ☆56 · Updated 4 years ago
- Baseline implementations for several NLP tasks, including text classification, named entity recognition, entity relation extraction, NL2SQL, CKBQA, and various BERT downstream applications ☆47 · Updated 4 years ago
- Long-text classification in PyTorch; networks used include FastText, TextCNN, TextRNN, TextRCNN, and Transformer ☆48 · Updated 5 years ago
- Personal code for NER tasks using BiLSTM-CRF, ELMo, BERT, and similar models ☆26 · Updated 5 years ago
- Relation extraction and entity recognition based on BERT (Entity Extraction and Relation Extraction using BERT) ☆13 · Updated 5 years ago
- Generating sentence or word vectors with a pretrained BERT model ☆27 · Updated 4 years ago
- Document-level event extraction ☆20 · Updated 4 years ago
- Text similarity with TF-IDF + Word2vec, best suited to long texts (a minimal sketch of this approach follows the list) ☆24 · Updated 5 years ago
- A collection of event extraction algorithms ☆125 · Updated 5 years ago
- Chinese text classification with CNN/RNN layers on top of Google BERT fine-tuning ☆19 · Updated 5 years ago
- Chinese named entity recognition based on ALBERT-BiLSTM-CRF ☆11 · Updated 4 years ago
- TensorFlow code and pre-trained models for BERT ☆58 · Updated 4 years ago
- TextCNN text classification on top of BERT pre-training ☆78 · Updated 5 years ago
- Chinese named entity recognition with CNN + BiLSTM + CRF ☆13 · Updated 4 years ago
- Nested named entity recognition (Nested NER) ☆20 · Updated 3 years ago
- ☆42 · Updated 2 years ago
- multi-label-classification-4-event-type ☆136 · Updated 2 years ago
- BERT_MRC, a SOTA model for the NER task ☆61 · Updated last year
- NLP Pretrained Embeddings, Models and Datasets Collections (NLP_PEMDC); the collection is continually updated ☆64 · Updated 5 years ago
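Several of the listed repos combine Word2vec with TF-IDF for text similarity. A minimal sketch of that general idea, assuming a trained gensim model (the file name, sample documents, and weighting scheme below are illustrative only, not code from any listed repository):

```python
# Sketch of TF-IDF-weighted Word2vec text similarity.
import numpy as np
from gensim.models import Word2Vec
from sklearn.feature_extraction.text import TfidfVectorizer

model = Word2Vec.load("word2vec_zh.model")        # hypothetical trained model
docs = ["今天 天气 很 好", "今天 天气 不错"]        # pre-segmented documents

tfidf = TfidfVectorizer(token_pattern=r"(?u)\S+").fit(docs)
vocab = tfidf.vocabulary_

def doc_vector(doc: str, weights: np.ndarray) -> np.ndarray:
    """TF-IDF-weighted average of the word vectors in one document."""
    vec = np.zeros(model.vector_size)
    total = 0.0
    for tok in doc.split():
        if tok in model.wv and tok in vocab:
            w = weights[vocab[tok]]
            vec += w * model.wv[tok]
            total += w
    return vec / total if total else vec

weights = tfidf.transform(docs).toarray()
v1, v2 = doc_vector(docs[0], weights[0]), doc_vector(docs[1], weights[1])
cos = float(v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9))
print(f"cosine similarity: {cos:.3f}")
```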