chenbjin / schoolCardLinks
A merged record of Chinese university name changes
☆13 Updated 10 years ago
Alternatives and similar repositories for schoolCardLinks
Users interested in schoolCardLinks are comparing it to the libraries listed below
- hanLP split out from the earlier hanLP-python-flask project ☆59 Updated 8 years ago
- Pure Python NLP toolkit ☆55 Updated 10 years ago
- A Python package for pullword.com ☆86 Updated 5 years ago
- Wrapper library (SDK) for the BosonNLP HTTP API ☆163 Updated 7 years ago
- A readability parser that extracts the title, content, and images from HTML pages ☆85 Updated 5 years ago
- A Chinese word segmenter based on CRF ☆234 Updated 7 years ago
- yaha ☆266 Updated 7 years ago
- A text analysis (match, rewrite, extract) engine (Python edition) ☆80 Updated 8 years ago
- Chinese Synonym Library ☆123 Updated 7 years ago
- ☆61 Updated last year
- Chinese natural language processing tools and examples ☆162 Updated 9 years ago
- Convert a Sogou input-method dictionary (.scel file) to an mmseg (coreseek) dictionary ☆97 Updated 12 years ago
- deepThought is a conversational smart bot ☆109 Updated 9 years ago
- Text-mining APIs ☆34 Updated 9 years ago
- A flexible web crawler based on Scrapy for fetching most Ajax and other various types of web pages. Easy to use: To customize a new web… ☆45 Updated 10 years ago
- This is my presentation area: a showcase for personal talk scripts and everyday notes ☆57 Updated 5 years ago
- An experiment in Chinese word segmentation based on deep learning ☆84 Updated 10 years ago
- Tools for Chinese word segmentation and POS tagging, written in Python ☆39 Updated 12 years ago
- Automatically discover Chinese words in large texts ☆92 Updated 11 years ago
- A Chinese word segmentation tool based on a Bayes model ☆79 Updated 12 years ago
- A Chinese word segmentation program that extracts the Chinese words in a text via correlation, without requiring a Chinese corpus ☆56 Updated 12 years ago
- Chinese word similarity computation based on HowNet ☆27 Updated 8 years ago
- Weibo topic search and analysis: Shanghai apartment rentals ☆115 Updated 9 years ago
- Tobe Algorithm Manual ☆48 Updated 5 years ago
- Fudan University's Chinese natural language processing toolkit ☆71 Updated 8 years ago
- A simple example of extracting a text co-occurrence network ☆82 Updated 9 years ago
- python-segment is a word segmentation library implemented in pure Python; its goal is to provide a usable, complete segmentation system and training environment, including a usable dictionary ☆16 Updated 12 years ago
- A demo site for jieba ☆111 Updated 12 years ago
- Entropy-based Chinese word segmentation, requiring no corpus ☆11 Updated 7 years ago
- [Translation] Natural Language Processing with Python, Chinese 2nd edition ☆63 Updated 7 years ago