outerWorld / snail
A data mining framework written in C++.
☆24 · Updated 12 years ago
Alternatives and similar repositories for snail
Users that are interested in snail are comparing it to the libraries listed below
- A Chinese word segmenter based on CRF ☆233 · Updated 6 years ago
- A Chinese word segmentation tool based on a Bayes model ☆79 · Updated 12 years ago
- A Chinese word segmentation library based on an HMM model ☆166 · Updated 10 years ago
- Chinese natural language processing tools and examples ☆162 · Updated 9 years ago
- Automatically discover Chinese words in large text collections ☆91 · Updated 10 years ago
- Yet another Chinese word segmentation package based on character-based tagging heuristics and the CRF algorithm ☆245 · Updated 12 years ago
- A pure-Python NLP toolkit ☆55 · Updated 9 years ago
- HanLP split out as a standalone package from the earlier hanLP-python-flask project ☆59 · Updated 7 years ago
- Chinese morphological analysis with word segmentation and POS tagging data for MeCab ☆161 · Updated 7 years ago
- An experiment in deep-learning-based Chinese word segmentation ☆84 · Updated 9 years ago
- A Python package for pullword.com ☆86 · Updated 4 years ago
- Annotated source code of jieba (Python version), a Chinese word segmenter ☆92 · Updated 6 years ago
- A Chinese synonym library ☆123 · Updated 7 years ago
- Count frequent n-grams in big data with limited memory ☆59 · Updated 11 years ago
- deepThought, a conversational smart bot ☆109 · Updated 8 years ago
- A Chinese natural language processing toolkit ☆86 · Updated 10 years ago
- Chinese word segmentation using CRF++ ☆24 · Updated 10 years ago
- Lean Semantic Web tutorials ☆128 · Updated 11 years ago
- tyccl (同义词词林), a Ruby gem that provides friendly functions to analyze similarity between Chinese words ☆46 · Updated 11 years ago
- ☆99 · Updated 11 years ago
- Some articles written by Bao Jie ☆1 · Updated 8 years ago
- Chinese word segmentation ☆27 · Updated 11 years ago
- Chinese Environment Emergency Corpus, Semantic Intelligence Laboratory, Shanghai University ☆46 · Updated 9 years ago
- ☆29 · Updated 3 years ago
- TensorFlow-based neural conversation models ☆29 · Updated 7 years ago
- A demo site for jieba ☆111 · Updated 11 years ago
- A Chinese word segmentation program that extracts Chinese words from a passage by correlation, without requiring a Chinese corpus ☆56 · Updated 12 years ago
- Chinese tokenizer and new-word finder: a three-stage mechanical segmentation algorithm with out-of-vocabulary new-word discovery ☆95 · Updated 8 years ago
- Word segmentation using neural networks, based on the package https://github.com/SUTDNLP/LibN3L ☆23 · Updated 9 years ago
- Chinese word segmentation implemented with deep learning ☆62 · Updated 7 years ago