JackHCC / Chinese-Tokenization
Chinese word segmentation implemented with traditional methods (n-gram, HMM, etc.), neural-network methods (CNN, LSTM, etc.), and pre-trained models (BERT, etc.)
☆35 · Updated 3 years ago
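To illustrate the traditional HMM approach named in the description, here is a minimal sketch of BMES-tag Viterbi decoding for word segmentation. The tag set (B/M/E/S), the toy hand-set probability tables, and the example sentence are assumptions for demonstration; the repository itself would estimate these probabilities from a training corpus.

```python
# Minimal HMM word-segmentation sketch: tag each character as
# B (word begin), M (middle), E (end), or S (single-char word),
# then cut the text at every E or S tag.
# NOTE: all probabilities below are toy hand-set values, not trained.

STATES = ["B", "M", "E", "S"]

start_p = {"B": 0.5, "M": 0.0, "E": 0.0, "S": 0.5}
trans_p = {
    "B": {"B": 0.0, "M": 0.3, "E": 0.7, "S": 0.0},
    "M": {"B": 0.0, "M": 0.3, "E": 0.7, "S": 0.0},
    "E": {"B": 0.5, "M": 0.0, "E": 0.0, "S": 0.5},
    "S": {"B": 0.5, "M": 0.0, "E": 0.0, "S": 0.5},
}
# Toy emission table; unseen characters fall back to a uniform default.
emit_p = {
    "北": {"B": 0.7, "M": 0.1, "E": 0.1, "S": 0.1},
    "京": {"B": 0.1, "M": 0.1, "E": 0.7, "S": 0.1},
    "的": {"B": 0.05, "M": 0.05, "E": 0.05, "S": 0.85},
}

def viterbi(text):
    """Return the most probable BMES tag sequence for `text`."""
    default = {s: 0.25 for s in STATES}
    V = [{s: start_p[s] * emit_p.get(text[0], default)[s] for s in STATES}]
    path = {s: [s] for s in STATES}
    for ch in text[1:]:
        emit = emit_p.get(ch, default)
        V.append({})
        new_path = {}
        for s in STATES:
            # Best previous state for ending up in state s here.
            prob, prev = max((V[-2][p] * trans_p[p][s] * emit[s], p)
                             for p in STATES)
            V[-1][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(STATES, key=lambda s: V[-1][s])
    return path[best]

def segment(text):
    """Cut `text` into words according to the decoded BMES tags."""
    words, buf = [], ""
    for ch, tag in zip(text, viterbi(text)):
        buf += ch
        if tag in ("E", "S"):
            words.append(buf)
            buf = ""
    if buf:
        words.append(buf)
    return words

print(segment("北京的"))  # → ['北京', '的'] under this toy model
```

Under the toy model, "北京的" decodes to the tags B E S, yielding the two words "北京" and "的". A real segmenter would replace the hand-set tables with counts from an annotated corpus and work in log space to avoid underflow on long sentences.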
Alternatives and similar repositories for Chinese-Tokenization
Users interested in Chinese-Tokenization are comparing it to the libraries listed below.
- smp ewect code — ☆77 · Updated 4 years ago
- Competition write-up for netizen emotion recognition during the COVID-19 epidemic, with the top 1–3 solutions — ☆50 · Updated 4 years ago
- Text classification implemented with huggingface