The-Orizon / nlputils
Utility scripts or libraries for various Natural Language Processing tasks.
☆39 · Updated 3 years ago
Alternatives and similar repositories for nlputils
Users interested in nlputils are comparing it to the libraries listed below.
- Corpus creator for Chinese Wikipedia ☆41 · Updated 3 years ago
- A tool for ancient Chinese word segmentation. ☆53 · Updated 6 years ago
- Classical Chinese punctuation experiment with Keras, using the Daizhige (殆知阁古代文献藏书) dataset ☆35 · Updated 2 years ago
- An open solution for collecting n-gram Chinese lexicons and n-gram statistics ☆74 · Updated 9 years ago
- ☆93 · Updated last week
- NanGe - A rule-based Chinese-English machine translation system ☆20 · Updated 7 years ago
- An open-source classical Chinese information processing toolkit developed by the Tsinghua Natural Language Processing Group ☆51 · Updated 6 years ago
- (WIP) My humble contribution to the democratization of Chinese NLP technology ☆46 · Updated 6 years ago
- THU Chinese Keyphrase Extraction Toolkit ☆124 · Updated 7 years ago
- Chinese tokenizer module for Python ☆15 · Updated 6 years ago
- Hanzi converter for Traditional and Simplified Chinese ☆188 · Updated 5 years ago
- Large-scale Chinese corpus (大规模中文语料) ☆42 · Updated 5 years ago
- ☆66 · Updated 8 years ago
- A corpus of Chinese abbreviations, including negative full forms. ☆196 · Updated 3 years ago
- Chinese word segmentation module of LTP ☆46 · Updated 9 years ago
- ZPar statistical parser. Universal language support (depending on the availability of training data), with language-specific features for… ☆135 · Updated 8 years ago
- A Chinese cloze-style RC dataset: People's Daily & Children's Fairy Tale (CFT) ☆171 · Updated 6 years ago
- Chinese stopwords collection ☆135 · Updated 5 years ago
- Yet another Chinese word segmentation package, based on character-based tagging heuristics and the CRF algorithm ☆245 · Updated 12 years ago
- Entropy-based Chinese word segmentation algorithm that requires no corpus (基于熵，无需语料库的中文分词) ☆11 · Updated 7 years ago
- Interpoetry: generating classical Chinese poems from vernacular Chinese. ☆43 · Updated 5 years ago
- MicroTokenizer: a lightweight, full-featured Chinese tokenizer designed for educational and research purposes, helping students understand how tokenizers work. Provides a… ☆153 · Updated 7 months ago
- Chinese translation of Danqi Chen's PhD thesis. https://chendq-thesis-zh.readthedocs.io/en/latest/ ☆35 · Updated 5 years ago
- Evaluation of Chinese word segmentation tools (中文分词工具评估) ☆61 · Updated 2 years ago
- ☆6 · Updated 7 years ago
- ChineseAntiword: antonym search interface for Chinese words, based on dictionaries crawled from online resources ☆59 · Updated 6 years ago
- Chinese word similarity computation based on HowNet ☆27 · Updated 7 years ago
- Berserker - BERt chineSE woRd toKenizER ☆16 · Updated 6 years ago
- A dataset containing 37 million Douban Dushu comments ☆61 · Updated 6 years ago
- Backup of Zhihu exposé posts about Shannon.AI (香侬科技, 北京香侬慧语科技有限责任公司) ☆41 · Updated 5 years ago