ericperfect / libtorch_tokenizer
BERT Tokenizer in C++
☆79 · Updated 4 years ago
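For context on what a repository like this does: BERT tokenizers split each word into sub-word pieces with the greedy longest-match-first WordPiece algorithm. The sketch below illustrates only that core loop; the toy vocabulary, the `wordpiece` function name, and the `##`/`[UNK]` conventions are generic WordPiece assumptions for illustration, not the actual API of libtorch_tokenizer.

```cpp
// Minimal sketch of WordPiece tokenization (greedy longest-match-first),
// the sub-word algorithm BERT tokenizers use. Names and vocabulary are
// illustrative, not libtorch_tokenizer's API.
#include <iostream>
#include <string>
#include <unordered_set>
#include <vector>

std::vector<std::string> wordpiece(const std::string& word,
                                   const std::unordered_set<std::string>& vocab) {
    std::vector<std::string> pieces;
    std::size_t start = 0;
    while (start < word.size()) {
        std::size_t end = word.size();
        std::string match;
        // Shrink the candidate substring from the right until it is in the vocab.
        while (start < end) {
            std::string sub = word.substr(start, end - start);
            if (start > 0) sub = "##" + sub;   // continuation pieces carry the ## prefix
            if (vocab.count(sub)) { match = sub; break; }
            --end;
        }
        if (match.empty()) return {"[UNK]"};   // no piece matched: whole word is unknown
        pieces.push_back(match);
        start = end;                           // continue after the matched piece
    }
    return pieces;
}

int main() {
    const std::unordered_set<std::string> vocab = {"un", "##aff", "##able", "[UNK]"};
    for (const auto& p : wordpiece("unaffable", vocab)) std::cout << p << ' ';
    std::cout << '\n';                         // prints: un ##aff ##able
    return 0;
}
```

A full BERT tokenizer additionally performs basic tokenization first (whitespace/punctuation splitting, optional lowercasing, per-character handling of CJK text) and works on UTF-8 input, which this ASCII-only sketch omits.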
Alternatives and similar repositories for libtorch_tokenizer
Users interested in libtorch_tokenizer are comparing it to the libraries listed below
- High-performance text tokenizer library ☆32 · Updated last year
- C++ model training & inference framework ☆222 · Updated 6 years ago
- BERT implemented in pure C++ ☆36 · Updated 5 years ago
- Minimal example of using a traced huggingface transformers model with libtorch ☆35 · Updated 5 years ago
- Chinese MobileBERT ☆98 · Updated 3 years ago
- Lightweight deep learning inference service framework ☆39 · Updated 4 years ago
- ☆90 · Updated 2 years ago
- Python | Efficient use of the kenlm statistical language model: new-word discovery, word segmentation, intelligent spelling correction, and more ☆169 · Updated 6 years ago
- Fine-tuning BERT on Chinese-domain data with character-level masking and whole-word masking (wwm), without the TensorFlow Estimator API ☆24 · Updated 5 years ago
- transformer tokenizers (e.g. BERT tokenizer) in C++ (WIP) ☆18 · Updated 3 years ago
- Large-scale Chinese corpus ☆44 · Updated 6 years ago
- On-device reading-comprehension QA for mobile, Android & iPhone ☆60 · Updated 3 years ago
- TensorRT ☆11 · Updated 5 years ago
- Chinese text error correction based on seq2edit (Gector) ☆29 · Updated 3 years ago
- Chinese-language UniLM pre-trained model ☆82 · Updated 4 years ago
- Introductory articles on dialogue rewriting ☆98 · Updated 2 years ago
- An upgraded version of RoFormer ☆154 · Updated 3 years ago
- ☆102 · Updated 5 years ago
- Code implementation of Baichuan's Dynamic NTK-ALiBi: longer-context inference without fine-tuning ☆49 · Updated 2 years ago
- A highly efficient string-matching tool supporting forward/backward maximum-matching word segmentation and multi-pattern exact string matching ☆16 · Updated 2 years ago
- A more efficient GLM implementation! ☆54 · Updated 2 years ago
- Chinese text error correction based on BERT ☆239 · Updated 2 years ago
- Python toolkit for the Chinese Language Understanding Evaluation (CLUE) benchmark ☆134 · Updated 2 years ago
- SOTA solution and online demo for the CTC2021 Chinese Text Correction competition ☆73 · Updated 2 years ago
- A fast, flexible, extensible, and easy-to-use NLP large-scale pretraining and multi-task learning framework ☆184 · Updated 4 years ago
- Export sentence-transformers models to ONNX for faster SBERT inference ☆168 · Updated 3 years ago
- A pretrained-model-based sentence-embedding generation tool ☆138 · Updated 2 years ago
- Deep-learning-based speech recognition on the THCHS30 dataset ☆14 · Updated 4 years ago
- Joint model for intent detection and slot filling ☆42 · Updated 3 years ago
- Chinese character feature extraction tool: extracts pronunciation (initial, final, tone), glyph (radical, component), and four-corner-code features, which can be fed to a model as tensors ☆138 · Updated 5 years ago