BERT distillation (distillation experiments based on BERT)
☆314 · Updated Jul 30, 2020
Alternatives and similar repositories for bert_distill
Users interested in bert_distill are comparing it to the repositories listed below.
- Knowledge Distillation from BERT ☆54 · Updated Jan 7, 2019
- Knowledge distillation for text classification in PyTorch: Chinese text classification with BERT and XLNet teachers and a BiLSTM student. ☆229 · Updated Jul 27, 2022
- Distilling Task-Specific Knowledge from BERT into Simple Neural Networks. ☆15 · Updated Aug 28, 2020
- PyTorch implementation of Patient Knowledge Distillation for BERT Model Compression ☆203 · Updated Sep 20, 2019
- Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab. ☆3,156 · Updated Jan 22, 2024
- A clean, easy-to-use TinyBERT: a pretrained language model built by knowledge distillation from BERT ☆271 · Updated Oct 24, 2020
- A PyTorch-based knowledge distillation toolkit for natural language processing ☆1,696 · Updated May 8, 2023
- ☆61 · Updated Nov 14, 2019
- Text-matching models such as DSSM, ESIM, ABCNN, and BIMPM, using the official LCQMC dataset ☆470 · Updated May 8, 2022
- The source code of FastBERT (ACL 2020) ☆609 · Updated Oct 29, 2021
- A reproduction of the ACL 2020 FastBERT paper (arxiv.org/pdf/2004.02178.pdf) ☆193 · Updated Dec 15, 2021
- Chinese pretrained RoBERTa models: RoBERTa for Chinese ☆2,773 · Updated Jul 22, 2024
- ☆279 · Updated Dec 8, 2020
- A LITE BERT FOR SELF-SUPERVISED LEARNING OF LANGUAGE REPRESENTATIONS; large-scale Chinese pretrained ALBERT models ☆3,984 · Updated Nov 21, 2022
- A reproduction of the paper "Distilling Task-Specific Knowledge from BERT into Simple Neural Networks" ☆16 · Updated Jun 13, 2021
- A distilled RoBERTa-wwm base model, with RoBERTa-wwm-large as the teacher ☆66 · Updated Mar 30, 2020
- A collection of high-quality Chinese pretrained models: state-of-the-art large models, the fastest small models, and dedicated similarity models ☆816 · Updated Jul 8, 2020
- Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo ☆3,106 · Updated May 9, 2024
- A Keras implementation of distilling BERT (12 layers) into CNN, BiLSTM, and BERT (3-layer) students ☆29 · Updated Mar 3, 2020
- Sequence tagging with BiLSTM-CRF, BERT, and other methods ☆415 · Updated Jun 12, 2023
- Pre-Trained Chinese XLNet models ☆1,650 · Updated Jul 15, 2025
- First-place online solution for the Tianchi COVID-19 similar sentence-pair matching competition ☆435 · Updated Oct 17, 2020
- PyTorch-based Chinese semantic similarity matching models (ABCNN, ALBERT, BERT, BIMPM, DecomposableAttention, DistilBERT, ESIM, RE2, RoBERTa, SiaGRU, XLNet) ☆796 · Updated Mar 22, 2020
- NEZHA: Neural Contextualized Representation for Chinese Language Understanding ☆259 · Updated Aug 13, 2021
- Open Language Pre-trained Model Zoo ☆1,005 · Updated Nov 18, 2021
- Text-similarity methods in PyTorch ☆469 · Updated Dec 9, 2018
- DistilBERT for Chinese: large-scale Chinese pretrained distilled BERT models ☆95 · Updated Dec 5, 2019
- Code for the ACL 2020 paper FLAT: Chinese NER Using Flat-Lattice Transformer ☆1,004 · Updated May 10, 2022
- BERT distillation in practice, including distilling BERT into a BiLSTM and TinyBERT ☆13 · Updated Apr 23, 2022
- First place in the CCKS Baidu entity-linking competition ☆843 · Updated Dec 19, 2023
- We start from a company-name recognition task with small-scale, low-quality training data, then use techniques to enhance model training s… ☆81 · Updated Aug 10, 2020
- Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series) ☆10,177 · Updated Jul 15, 2025
- Distilling BERT using natural language generation ☆39 · Updated Aug 13, 2023
- DeepIE: Deep Learning for Information Extraction ☆1,943 · Updated Dec 9, 2022
- First-place solution to the CCF 2020 QA-match competition ☆267 · Updated Jan 28, 2021
- Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpora, and leaderboard ☆1,787 · Updated Feb 18, 2023
- An implementation of the EDA paper for Chinese corpora: an EDA data-augmentation tool for Chinese text, NLP data augmentation, and paper-reading notes ☆1,386 · Updated May 31, 2022
- A simple, effective toolkit for short text matching ☆329 · Updated Jul 19, 2022
- Transforms multi-label classification into a sentence-pair task, yielding more training data and information ☆178 · Updated Dec 13, 2019
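Most of the distillation repositories above share the same core objective: train a small student (e.g. a BiLSTM or a 3-layer BERT) on a blend of the teacher's temperature-softened output distribution and the gold labels. A minimal, framework-free sketch of that Hinton-style loss follows; the function names and the `alpha`/`temperature` defaults are illustrative and not taken from any repository listed here:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: higher T flattens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distill_loss(student_logits, teacher_logits, hard_label,
                 temperature=2.0, alpha=0.5):
    """Weighted sum of soft-target cross-entropy (against the teacher's
    softened distribution) and ordinary hard-label cross-entropy."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # Soft loss, scaled by T^2 so gradient magnitudes stay comparable
    # across different temperature settings.
    soft = -sum(pt * math.log(ps) for pt, ps in zip(p_teacher, p_student))
    soft *= temperature ** 2
    # Hard loss: standard cross-entropy against the gold label (T = 1).
    hard = -math.log(softmax(student_logits)[hard_label])
    return alpha * soft + (1 - alpha) * hard
```

In practice the teacher logits come from a frozen fine-tuned BERT, and the loss above is minimized over the student's parameters; several of the repos listed (e.g. the BiLSTM-student ones) vary mainly in the student architecture and the value of `alpha`.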