17zuoye / detdup
Detect duplicated items. A content-deduplication framework.
☆11 · Updated 10 years ago
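For context, near-duplicate detection of the kind detdup targets is often built on shingling plus a set-similarity measure. Below is a minimal illustrative sketch in Python; the function names (`shingles`, `jaccard`, `is_duplicate`) and the 0.8 threshold are hypothetical choices for this example, not detdup's actual API.

```python
def shingles(text, n=3):
    """Return the set of character n-gram shingles of a string."""
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (1.0 if both empty)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def is_duplicate(doc1, doc2, threshold=0.8):
    """Flag two texts as near-duplicates when shingle overlap is high."""
    return jaccard(shingles(doc1), shingles(doc2)) >= threshold

print(is_duplicate("hello world example", "hello world example!"))  # → True
print(is_duplicate("abc", "xyz"))                                   # → False
```

Real deduplication frameworks typically replace the pairwise Jaccard step with MinHash or SimHash signatures so that candidate pairs can be found without comparing every pair of documents.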
Alternatives and similar repositories for detdup
Users interested in detdup are comparing it to the libraries listed below:
- tyccl (同义词词林) is a Ruby gem that provides friendly functions to analyse similarity between Chinese words ☆46 · Updated 11 years ago
- A Python package for pullword.com ☆86 · Updated 5 years ago
- Yet another Chinese word segmentation package, based on character-based tagging heuristics and the CRF algorithm ☆245 · Updated 12 years ago
- Pure-Python NLP toolkit ☆55 · Updated 9 years ago
- Chinese word segmentation library based on an HMM model ☆166 · Updated 11 years ago
- ☆68 · Updated 10 years ago
- Chinese natural language processing toolkit ☆86 · Updated 10 years ago
- yaha ☆267 · Updated 7 years ago
- A Chinese word segmentation tool based on a Bayes model ☆79 · Updated 12 years ago
- Automatically discovers Chinese words in large texts ☆92 · Updated 11 years ago
- Distributed text-analysis suite based on Celery ☆96 · Updated 3 years ago
- A Chinese word segmenter based on CRF ☆234 · Updated 6 years ago
- ☆99 · Updated 11 years ago
- Detect duplicated items framework (content deduplication) ☆12 · Updated 10 years ago
- A text-analyzing (match, rewrite, extract) engine (Python edition) ☆80 · Updated 8 years ago
- Fudan's Chinese natural language processing toolkit ☆72 · Updated 8 years ago
- BosonNLP HTTP API wrapper library (SDK) ☆163 · Updated 7 years ago
- Converts a Sogou input-method dictionary (.scel file) to an mmseg (coreseek) dictionary ☆96 · Updated 12 years ago
- stan-cn-nlp: an API wrapper based on the Stanford NLP packages, for the convenience of Chinese users ☆57 · Updated 9 years ago
- Splits the hanLP component of the earlier hanLP-python-flask project out into its own package ☆59 · Updated 8 years ago
- A distributed Sina Weibo search spider based on Scrapy and Redis ☆145 · Updated 12 years ago
- Chinese natural language processing tools and examples ☆162 · Updated 9 years ago
- ☆56 · Updated 9 years ago
- autocomplete-redis: Quora-like autocompletion based on Redis ☆204 · Updated 12 years ago
- Adapters for Solr: jieba, FudanNLP, Stanford NLP ☆74 · Updated 8 years ago
- rmmseg-cpp with a Python interface ☆189 · Updated 11 years ago
- General web-page main-text (and image) extraction based on line-block distribution functions, Python version ☆115 · Updated 9 years ago
- Chinese tokenizer and new-word finder: a three-stage mechanical Chinese segmentation algorithm plus an out-of-vocabulary new-word discovery algorithm ☆95 · Updated 9 years ago
- A spectrum-analysis-based music finder ☆107 · Updated 10 years ago
- deepThought is a conversational smart bot ☆109 · Updated 9 years ago