lxchtan / NAIC_AI_Plus
Code for the 2020 AI+ wireless communication competition.
☆14 Updated 4 years ago
Alternatives and similar repositories for NAIC_AI_Plus
Users interested in NAIC_AI_Plus are comparing it to the libraries listed below.
- Source code for "Training Generative Adversarial Networks Via Turing Test" ☆13 Updated 4 years ago
- Adafactor optimizer for Keras ☆20 Updated 3 years ago
- Shuffling files of hundreds of GB in Python ☆33 Updated 3 years ago
- bert-of-theseus via bert4keras ☆31 Updated 4 years ago
- Lightweight deep learning inference service framework ☆39 Updated 3 years ago
- Natural Language Processing ☆34 Updated 4 years ago
- Keras implementation of the Lazy optimizer ☆21 Updated 5 years ago
- GPT playing Chinese chess (Xiangqi), implemented with bert4keras ☆44 Updated 4 years ago
- High-performance small-model evaluation: Shared Tasks in NLPCC 2020, Task 1 - Light Pre-Training Chinese Language Model for NLP Task ☆58 Updated 4 years ago
- Contrastive learning for Shopee same-product matching ☆14 Updated 3 years ago
- Saving memory by recomputation, for Keras ☆37 Updated 5 years ago
- Building a CNN with NumPy to review deep learning fundamentals ☆35 Updated 6 years ago
- RAdam optimizer for Keras ☆71 Updated 5 years ago
- Chinese Natural Language Correction via Language Model ☆14 Updated 7 years ago
- ☆12 Updated 7 years ago
- 5th-place code for the 2019 DataGrand Cup information extraction competition ☆20 Updated 5 years ago
- Multi-task classifier ☆22 Updated last year
- Simple image annotation tool for ICDAR-style datasets; supports polygon and text annotation, convenient for labeling OCR datasets ☆54 Updated 5 years ago
- 300 lines of code (TensorFlow 2) that fully replicate the Transformer model, used for neural machine translation tasks and chat… ☆27 Updated 5 years ago
- Implementing "Dive into Deep Learning" with tensorflow.keras ☆10 Updated 5 years ago
- CLUE Emotion Analysis Dataset: a fine-grained sentiment analysis dataset ☆8 Updated 5 years ago
- Some methods for unsupervised text generation ☆48 Updated 3 years ago
- Adversarial Training for NLP in Keras ☆46 Updated 5 years ago
- NMT model with BERT in TensorFlow 2.0 ☆20 Updated 5 years ago
- Solving elementary-school math word problems with bert4keras ☆77 Updated 4 years ago
- Chinese translation of the paper "XLNet: Generalized Autoregressive Pretraining for Language Understanding" ☆49 Updated 5 years ago
- Memory-efficient DenseNet+LSTM+CTC for Chinese text recognition ☆31 Updated 2 years ago
- 10th-place solution on the Test B leaderboard, BLEU 32.1 ☆64 Updated 5 years ago
- Repository of the unattended exam proctoring team ☆16 Updated 2 years ago
- BERT-based pre-trained language model implementation in two stages: pre-training and fine-tuning. Currently includes three models (BERT, RoBERTa, ALBERT), all supporting Whole Word Mask mode ☆16 Updated 5 years ago