thevasudevgupta / transformers-adapters
This repository hosts my experiments for a project I did with OffNote Labs.
☆10 · Updated 4 years ago
Alternatives and similar repositories for transformers-adapters
Users interested in transformers-adapters are comparing it to the repositories listed below.
- Code for ACL 2022 paper "Expanding Pretrained Models to Thousands More Languages via Lexicon-based Adaptation" ☆30 · Updated 3 years ago
- Code and data for the IWSLT 2022 shared task on Formality Control for SLT ☆21 · Updated 2 years ago
- Research code for the paper "How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models" ☆27 · Updated 3 years ago
- Implementation of the paper "Sentence Bottleneck Autoencoders from Transformer Language Models" ☆17 · Updated 3 years ago
- ☆76 · Updated 4 years ago
- PyTorch implementation of NAACL 2021 paper "Multi-view Subword Regularization" ☆25 · Updated 4 years ago
- Codebase for probing and visualizing multilingual models ☆49 · Updated 5 years ago
- The implementation of "Neural Machine Translation without Embeddings", NAACL 2021 ☆33 · Updated 4 years ago
- A dataset for tuning and evaluating sentence simplification models with multiple rewriting transformations ☆56 · Updated 2 years ago
- The official implementation of NeurIPS 2021 "One Question Answering Model for Many Languages with Cross-lingual Dense Passage Ret…" ☆71 · Updated 3 years ago
- EMNLP 2021 Tutorial: Multi-Domain Multilingual Question Answering ☆38 · Updated 3 years ago
- Implementation of Marge, Pre-training via Paraphrasing, in PyTorch ☆76 · Updated 4 years ago
- An official implementation of the "BPE-Dropout: Simple and Effective Subword Regularization" algorithm ☆53 · Updated 4 years ago
- Statistics on multilingual datasets ☆17 · Updated 3 years ago
- ☆46 · Updated 3 years ago
- The official repository for NAACL 2021, "XOR QA: Cross-lingual Open-Retrieval Question Answering" ☆79 · Updated 4 years ago
- As Good as New: How to Successfully Recycle English GPT-2 to Make Models for Other Languages (ACL Findings 2021) ☆48 · Updated 3 years ago
- ☆94 · Updated last year