csebuetnlp / CrossSum
This repository contains the code, data, and models of the paper titled "CrossSum: Beyond English-Centric Cross-Lingual Summarization for 1,500+ Language Pairs" published in Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL’23), July 9-14, 2023.
☆53 · Updated last year
Alternatives and similar repositories for CrossSum
Users who are interested in CrossSum are comparing it to the libraries listed below.
- This repository contains the code, data, and models of the paper titled "XL-Sum: Large-Scale Multilingual Abstractive Summarization for 4… ☆275 · Updated last year
- ☆47 · Updated last month
- Multilingual abstractive summarization dataset extracted from WikiHow. ☆95 · Updated 7 months ago
- The official code for PRIMERA: Pyramid-based Masked Sentence Pre-training for Multi-document Summarization ☆157 · Updated 2 years ago
- ☆102 · Updated 2 weeks ago
- Official Implementation of "DialogLM: Pre-trained Model for Long Dialogue Understanding and Summarization." ☆142 · Updated 2 years ago
- The source code of "Language Models are Few-shot Multilingual Learners" (MRL @ EMNLP 2021) ☆53 · Updated 3 years ago
- Models for automatically transforming toxic text to neutral ☆35 · Updated 2 years ago
- Official code for LEWIS, from: "LEWIS: Levenshtein Editing for Unsupervised Text Style Transfer", ACL-IJCNLP 2021 Findings by Machel Rei… ☆31 · Updated 3 years ago
- Data and code for our paper "Exploring and Predicting Transferability across NLP Tasks", to appear at EMNLP 2020. ☆50 · Updated 4 years ago
- Abstractive opinion summarization system (SelSum) and the largest dataset of Amazon product summaries (AmaSum). EMNLP 2021 conference pap… ☆47 · Updated 3 years ago
- ☆100 · Updated last year
- Interpreting Language Models with Contrastive Explanations (EMNLP 2022 Best Paper Honorable Mention) ☆62 · Updated 3 years ago
- ☆71 · Updated 4 years ago
- [EMNLP'21] Mirror-BERT: Converting Pretrained Language Models to universal text encoders without labels. ☆78 · Updated 3 years ago
- A Natural Language Inference (NLI) model based on Transformers (BERT and ALBERT) ☆137 · Updated last year
- Code for NAACL 2021 full paper "Efficient Attentions for Long Document Summarization" ☆68 · Updated 4 years ago
- ☆203 · Updated 3 years ago
- Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer (ACL 2021) ☆30 · Updated 3 years ago
- Research code for "What to Pre-Train on? Efficient Intermediate Task Selection", EMNLP 2021 ☆37 · Updated 3 years ago
- ☆63 · Updated 2 years ago
- An official repository for MIA 2022 (NAACL 2022 Workshop) Shared Task on Cross-lingual Open-Retrieval Question Answering. ☆31 · Updated 3 years ago
- FRANK: Factuality Evaluation Benchmark ☆59 · Updated 2 years ago
- Source code for the paper "Learning from Noisy Labels for Entity-Centric Information Extraction", EMNLP 2021 ☆55 · Updated 3 years ago
- A Dataset for Tuning and Evaluation of Sentence Simplification Models with Multiple Rewriting Transformations ☆56 · Updated 3 years ago
- ☆41 · Updated 2 years ago
- Implement Question Generator with SOTA pre-trained Language Models (RoBERTa, BERT, GPT, BART, T5, etc.) ☆49 · Updated 3 years ago
- This repo supports various cross-lingual transfer learning & multilingual NLP models. ☆92 · Updated 2 years ago
- CodemixedNLP: An Extensible and Open NLP Toolkit for Code-Switching ☆18 · Updated 4 years ago
- Code associated with the ACL 2021 DExperts paper ☆118 · Updated 2 years ago