avplayer / wordmaker
Automatically generate Chinese words from huge text.
☆24 · Updated 10 years ago
Alternatives and similar repositories for wordmaker
Users who are interested in wordmaker are comparing it to the libraries listed below.
- Automatically generate Chinese words from huge text. ☆91 · Updated 10 years ago
- yaha ☆266 · Updated 6 years ago
- An experiment in Chinese word segmentation based on deep learning. ☆84 · Updated 9 years ago
- A Chinese natural language processing toolkit. ☆86 · Updated 10 years ago
- The hanLP part of the earlier hanLP-python-flask project, split out into its own package. ☆59 · Updated 7 years ago
- A Chinese segmenter based on CRF. ☆233 · Updated 6 years ago
- Pure Python NLP toolkit. ☆55 · Updated 9 years ago
- Chinese word segmentation module of LTP. ☆46 · Updated 9 years ago
- Chinese tokenizer and new-word finder: a three-stage mechanical Chinese segmentation algorithm plus an out-of-vocabulary new-word discovery algorithm. ☆95 · Updated 8 years ago
- Chinese natural language processing tools and examples. ☆162 · Updated 9 years ago
- A Chinese word semantic similarity algorithm based on HowNet (《知网》). ☆41 · Updated 12 years ago
- Yet another Chinese word segmentation package based on character-based tagging heuristics and the CRF algorithm. ☆245 · Updated 12 years ago
- deepThought is a conversational smart bot. ☆109 · Updated 8 years ago
- Lean Semantic Web tutorials. ☆128 · Updated 11 years ago
- An open-source Python wrapper for NLPIR. ☆82 · Updated 10 years ago
- Word segmentation using neural networks, based on the package https://github.com/SUTDNLP/LibN3L. ☆23 · Updated 9 years ago
- Details of the cw2vec paper. ☆82 · Updated 7 years ago
- Source code and corpora for the paper "Iterated Dilated Convolutions for Chinese Word Segmentation". ☆135 · Updated 4 years ago
- Implementation of the paper: Deng K, Bol P K, Li K J, et al. On the unsupervised analysis of domain-specific Chinese texts[J]. Proceedings of… ☆77 · Updated 8 years ago
- Neural conversational model in Torch. ☆89 · Updated 8 years ago
- Clone of "A Good Part-of-Speech Tagger in about 200 Lines of Python" by Matthew Honnibal. ☆48 · Updated 8 years ago
- A new-word discovery algorithm (NewWordDetection). ☆92 · Updated 4 years ago
- NLTK source. ☆31 · Updated 10 years ago
- tyccl (同义词词林) is a Ruby gem that provides friendly functions for analysing similarity between Chinese words. ☆46 · Updated 11 years ago
- A Chinese word segmentation library based on an HMM model. ☆166 · Updated 10 years ago
- A wrapper library (SDK) for the BosonNLP HTTP API. ☆163 · Updated 6 years ago
- Annotations on the jieba Chinese word segmenter (Python version). ☆92 · Updated 6 years ago
- Similarity computation for short Chinese sentences. ☆137 · Updated 7 years ago
- Chinese word segmentation implemented with deep learning. ☆61 · Updated 7 years ago
- Chinese word similarity computation based on HowNet. ☆27 · Updated 7 years ago