yym6472 / back_translate
Data augmentation via back-translation; currently integrates the Baidu, Youdao, and Google (VPN required in mainland China) translation services.
☆21 · Updated 5 years ago
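For context, back-translation augments a corpus by translating each sentence into a pivot language and back, yielding paraphrases of the original. The sketch below illustrates the idea only; the `translate_fn` callable and its signature are assumptions for illustration, not this repository's actual interface to the Baidu, Youdao, or Google APIs.

```python
# A minimal sketch of back-translation augmentation (not this repo's actual code).
# `translate_fn(text, src, dst)` is a hypothetical callable that should wrap a real
# translation service (Baidu, Youdao, Google, ...); a trivial stub is used here
# so the example runs end to end.
from typing import Callable

TranslateFn = Callable[[str, str, str], str]

def back_translate(sentence: str, translate_fn: TranslateFn,
                   src: str = "zh", pivot: str = "en") -> str:
    """Round-trip a sentence through a pivot language to obtain a paraphrase."""
    pivoted = translate_fn(sentence, src, pivot)   # e.g. zh -> en
    return translate_fn(pivoted, pivot, src)       # e.g. en -> zh

if __name__ == "__main__":
    # Stub translator for demonstration only; plug in a real API client here.
    def dummy_translate(text: str, src: str, dst: str) -> str:
        return f"[{src}->{dst}] {text}"

    print(back_translate("今天天气很好,适合出去散步。", dummy_translate))
```

In practice, the diversity and fluency of the augmented sentences depend on the translation service and the pivot language chosen.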
Alternatives and similar repositories for back_translate
Users interested in back_translate are comparing it to the repositories listed below
- ☆29 · Updated 6 years ago
- ☆23 · Updated 6 years ago
- Source code for the AAAI 2021 paper "LET: Linguistic Knowledge Enhanced Graph Transformer for Chinese Short Text Matching" ☆48 · Updated 4 years ago
- Data release for the CCL2020 "小牛杯" humor computation task ☆23 · Updated last year
- Baseline (PyTorch) for the CCKS2020 entity linking competition ☆19 · Updated 5 years ago
- The code for "A Unified MRC Framework for Named Entity Recognition" ☆33 · Updated 6 years ago
- 2020 Language and Intelligence Challenge: recommendation-oriented dialogue task ☆52 · Updated 4 years ago
- IPRE: a Dataset for Inter-Personal Relationship Extraction ☆95 · Updated 6 years ago
- ☆25 · Updated 6 years ago
- This is the repository for the NLPCC2020 AutoIE task ☆52 · Updated 5 years ago
- DescriptionPairsExtraction: extracts entity-description pairs based on Albert and data back-annotation ☆20 · Updated 3 years ago
- ☆61 · Updated 6 years ago
- 24×2 pretrained small BERT models, a handy toolkit for NLP practitioners ☆51 · Updated 5 years ago
- Source code for "Train No Evil: Selective Masking for Task-Guided Pre-Training" ☆70 · Updated 3 years ago
- Revised version of the SAT model in "Improved Word Representation Learning with Sememes" ☆50 · Updated 5 years ago
- Negative sampling for solving the unlabeled entity problem in NER. ICLR 2021 paper: "Empirical Analysis of Unlabeled Entity Problem in Named Entity Recognition" ☆134 · Updated 3 years ago
- Dataset and baseline for the CCL2019 "小牛杯" Chinese humor computation task ☆24 · Updated last year
- This repository implements the system described in "Growing Story Forest Online from Massive Breaking News" ☆65 · Updated 7 years ago
- Chinese Machine Reading: third place in the technical track of the 2021 Haihua AI Challenge on Chinese reading comprehension ☆21 · Updated 4 years ago
- Third-place solution for the 2020 BAAI-JD multimodal dialogue challenge (JDDC2020) ☆42 · Updated 5 years ago
- Using BERT for the lic2019 machine reading comprehension task ☆90 · Updated 6 years ago
- transformers implementation (architecture, task examples, serving, and more) ☆96 · Updated 3 years ago
- Code for the EMNLP 2019 paper "A Boundary-aware Neural Model for Nested Named Entity Recognition" ☆89 · Updated 3 years ago
- A seq2seq model with a BERT encoder and a Transformer decoder, applicable to text generation tasks in NLP ☆75 · Updated 6 years ago
- ☆67 · Updated 4 years ago
- BERT + self-attention encoder; biaffine decoder; PyTorch implementation ☆74 · Updated 5 years ago
- A long-text classifier implemented in PyTorch ☆46 · Updated 6 years ago
- A distilled RoBERTa-wwm base model, distilled from RoBERTa-wwm-large ☆66 · Updated 5 years ago
- Chinese UniLM pretrained model ☆83 · Updated 4 years ago
- Some methods for unsupervised text generation ☆49 · Updated 4 years ago