zhigangc / friso
a Chinese tokenizer
☆17 · Updated 11 years ago
Related projects
Alternatives and complementary repositories for friso
- Chinese word segmentation module of LTP ☆46 · Updated 9 years ago
- Chinese Word Similarity Computation based on HowNet ☆27 · Updated 6 years ago
- ZPar statistical parser. Universal language support (depending on the availability of training data), with language-specific features for… ☆134 · Updated 8 years ago
- C++ implementation of the mmseg word segmentation algorithm ☆33 · Updated 8 years ago
- ctbparser: an open-source Chinese processing toolkit written in C++ (GBK encoding) for word segmentation, part-of-speech tagging, and dependency parsing, following the Chinese Treebank (CTB) annotation standard ☆12 · Updated 10 years ago
- Chinese processing ☆36 · Updated 10 years ago
- cppjieba wrapped with SWIG ☆17 · Updated 6 years ago
- C++ header (hpp) library with a Python-like style ☆129 · Updated 2 months ago
- NanGe - A Rule-based Chinese-English Machine Translation System ☆20 · Updated 7 years ago
- Automatic Chinese word discovery in large texts ☆24 · Updated 10 years ago
- A tiny Chinese keyword extraction service ☆53 · Updated 7 years ago
- Clone of "A Good Part-of-Speech Tagger in about 200 Lines of Python" by Matthew Honnibal ☆49 · Updated 8 years ago
- Automatic Chinese word discovery in large texts ☆92 · Updated 10 years ago
- Chatbot framework for Chinese based on ChatScript ☆41 · Updated 7 years ago
- Yet another Chinese word segmentation package based on character-based tagging heuristics and the CRF algorithm (see the sketch after this list) ☆243 · Updated 11 years ago
- An Efficient Lexical Analyzer for Chinese ☆38 · Updated 5 years ago
- Chinese tokenizer benchmark ☆23 · Updated 6 years ago
- ☆50 · Updated 8 years ago
- ☆10 · Updated 6 years ago
- ☆23 · Updated 8 years ago
- minitools ☆104 · Updated 11 years ago
- A spoken dialog system framework based on the RavenClaw dialog engine ☆14 · Updated 9 years ago
- Natural language processing experiments ☆12 · Updated 9 years ago
- Word segmentation using neural networks, based on the https://github.com/SUTDNLP/LibN3L package ☆23 · Updated 8 years ago
- CRF-based Chinese word segmenter ☆19 · Updated 10 years ago
- Chinese natural language processing toolkit ☆85 · Updated 9 years ago
- LASSO is a parallel regression model learning system ☆69 · Updated 10 years ago
- An open solution for building a Chinese n-gram lexicon and collecting n-gram statistics ☆74 · Updated 8 years ago
- Fine-grained word segmentation made easy with Jieba ☆16 · Updated 5 years ago
- Chinese tokenizer and new-word finder: a three-stage dictionary-based segmentation algorithm plus an out-of-vocabulary new-word discovery algorithm ☆95 · Updated 8 years ago
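
Several of the entries above describe character-based tagging approaches to Chinese word segmentation (the CRF-based packages in particular). As a point of reference, here is a minimal sketch of how BMES character tags are decoded back into segmented words. It is an illustrative example only, not code from friso or any repository listed above.

```python
# Illustrative sketch: decoding BMES character tags into words.
# B = beginning of a multi-character word, M = middle, E = end, S = single-character word.
# Generic example for character-based segmentation; not taken from any listed project.

def decode_bmes(chars, tags):
    """Group characters into words according to their BMES tags."""
    words, current = [], []
    for ch, tag in zip(chars, tags):
        if tag == "S":        # single-character word
            words.append(ch)
        elif tag == "B":      # start a new multi-character word
            current = [ch]
        elif tag == "M":      # continue the current word
            current.append(ch)
        elif tag == "E":      # close the current word
            current.append(ch)
            words.append("".join(current))
            current = []
    return words

# Example: "我爱北京" tagged as S S B E -> ["我", "爱", "北京"]
print(decode_bmes(list("我爱北京"), ["S", "S", "B", "E"]))
```

A sequence labeler such as a CRF predicts the tag sequence; the decoding step above is what turns those per-character labels into the final segmentation.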