bytedance / ParaGen
ParaGen is a PyTorch deep learning framework for parallel sequence generation.
☆186 · Updated 2 years ago
Alternatives and similar repositories for ParaGen:
Users interested in ParaGen are comparing it to the libraries listed below.
- ☆166 · Updated 3 years ago
- Implementation of "Glancing Transformer for Non-Autoregressive Neural Machine Translation" ☆137 · Updated 2 years ago
- ☆53 · Updated 3 years ago
- ☆120 · Updated 3 years ago
- Introduction to CPM ☆165 · Updated 3 years ago
- A unified tokenization tool for Images, Chinese and English. ☆152 · Updated 2 years ago
- Finetune CPM-1 ☆74 · Updated 2 years ago
- Code, Data and Demo for Paper: Controllable Generation from Pre-trained Language Models via Inverse Prompting ☆123 · Updated 2 years ago
- Transformer model based on the Gated Attention Unit (preview version) ☆97 · Updated 2 years ago
- Code for CPM-2 Pre-Train ☆158 · Updated 2 years ago
- NLU & NLG (zero-shot) built on the mengzi-t5-base-mt pretrained model ☆75 · Updated 2 years ago
- NTK-scaled version of the ALiBi position encoding in Transformers ☆67 · Updated last year
- FLASHQuad_pytorch ☆67 · Updated 3 years ago
- Python toolkit for the Chinese Language Understanding Evaluation (CLUE) benchmark ☆129 · Updated last year
- Zero-shot learning evaluation benchmark, Chinese version ☆56 · Updated 3 years ago
- ☆218 · Updated 2 years ago
- MD5 links for Chinese book corpora ☆217 · Updated last year
- reStructured Pre-training ☆98 · Updated 2 years ago
- This is a code repository for the ACL 2022 paper "Learning to Generalize to More: Continuous Semantic Augmentation for Neural Machine Tra…" ☆52 · Updated 2 years ago
- Upgraded version of RoFormer ☆153 · Updated 2 years ago
- Backup of Zhihu exposé posts about Shannon.AI (北京香侬慧语科技有限责任公司) ☆41 · Updated 4 years ago
- Tracking progress in NLG for task-oriented dialogue systems (resources, code, new frontiers, etc.) ☆134 · Updated 3 years ago
- Finetune CPM-2 ☆82 · Updated 2 years ago
- Pretrain CPM-1 ☆51 · Updated 4 years ago
- ☆76 · Updated last year
- EVA: Large-scale Pre-trained Chit-Chat Models ☆307 · Updated 2 years ago
- Investigating Prior Knowledge for Challenging Chinese Machine Reading Comprehension ☆166 · Updated 3 years ago
- A paper list of pre-trained language models (PLMs). ☆139 · Updated 3 years ago
- A fast, flexible, extensible and easy-to-use NLP large-scale pretraining and multi-task learning framework. ☆183 · Updated 4 years ago
- This is the official code repository for NumNet+ (https://leaderboard.allenai.org/drop/submission/blu418v76glsbnh1qvd0) ☆177 · Updated 8 months ago