ares5221 / Common-NLP-Datasets
☆13 · Updated 4 years ago
Alternatives and similar repositories for Common-NLP-Datasets:
Users interested in Common-NLP-Datasets are comparing it to the libraries listed below.
- Loading CDial-GPT with bert4keras ☆38 · Updated 4 years ago
- Large-scale Chinese corpus ☆40 · Updated 5 years ago
- ☆101 · Updated 4 years ago
- Convert PyTorch BERT weights to TensorFlow ☆21 · Updated 4 years ago
- High-performance small-model evaluation. Shared Tasks in NLPCC 2020, Task 1: Light Pre-Training Chinese Language Model for NLP Task ☆57 · Updated 4 years ago
- DistilBERT for Chinese: a distilled BERT model pre-trained on massive Chinese text ☆90 · Updated 5 years ago
- Chinese UniLM pre-trained model ☆83 · Updated 3 years ago
- bert-of-theseus via bert4keras ☆31 · Updated 4 years ago
- ALBERT Large QA model trained on the Baidu WebQA and DuReader datasets ☆75 · Updated 4 years ago
- Offline on-device reading comprehension: QA for mobile, Android & iPhone ☆60 · Updated 2 years ago
- Some methods for unsupervised text generation ☆48 · Updated 3 years ago
- Tool for extracting, parsing, and normalizing time expressions ☆50 · Updated 2 years ago
- Implementation of Dynamic NTK-ALiBi for Baichuan: inference over longer texts without fine-tuning ☆47 · Updated last year
- Shuffling files of hundreds of GB in Python ☆33 · Updated 3 years ago
- ChineseAntiword: an antonym lookup interface for Chinese words, based on dictionaries crawled from online resources ☆59 · Updated 6 years ago
- TensorFlow version of BERT-of-Theseus ☆62 · Updated 4 years ago
- ☆32 · Updated 3 years ago
- Code for the AI Challenger 2018 machine reading comprehension task ☆27 · Updated 6 years ago
- General-purpose sequence labeling library based on TensorFlow & PaddlePaddle (currently includes BiLSTM+CRF, Stacked-BiLSTM+CRF, and IDCNN+CRF, with more algorithms being added), supporting Chinese word segmentation (tokenizer / segmentation), part-of-speech tagging… ☆84 · Updated 2 years ago
- flyai medical QA NLG ☆21 · Updated 5 years ago
- Soft-Masked BERT (soft_mask_bert) model for Chinese spelling correction in Keras ☆21 · Updated 4 years ago
- ☆89 · Updated 4 years ago
- Templates for common NLP tasks ☆34 · Updated last year
- Chinese translation of the paper "XLNet: Generalized Autoregressive Pretraining for Language Understanding" ☆49 · Updated 5 years ago
- Solving elementary-school math word problems with bert4keras ☆77 · Updated 4 years ago
- Tianchi COVID-19 similar sentence-pair matching competition, team 大白 (Rank 6) ☆21 · Updated 4 years ago
- ALBERT + BiLSTM + CRF implementation based on lightweight ALBERT ☆88 · Updated last year
- A collection of convenient methods for data science competitions ☆42 · Updated 4 years ago
- Zero-shot learning evaluation benchmark, Chinese version ☆54 · Updated 3 years ago
- ☆49 · Updated 3 years ago