maroxtn / mt5-M2M-comparison
Comparing M2M and mT5 on a rare language pair, blog post: https://medium.com/@abdessalemboukil/comparing-facebooks-m2m-to-mt5-in-low-resources-translation-english-yoruba-ef56624d2b75
☆16Updated 4 years ago
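For context on what the comparison covers, here is a minimal sketch of English→Yoruba translation with the public facebook/m2m100_418M checkpoint via 🤗 Transformers. The checkpoint name and example sentence are assumptions for illustration; the repo itself compares fine-tuned M2M and mT5 models, and this only shows the M2M side.

```python
# Minimal sketch (assumption: off-the-shelf facebook/m2m100_418M, not the
# repo's fine-tuned models) of English -> Yoruba translation.
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")
model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")

tokenizer.src_lang = "en"  # source language: English
encoded = tokenizer("How are you today?", return_tensors="pt")

# Force the decoder to start with the Yoruba language token.
generated = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.get_lang_id("yo"),
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```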
Alternatives and similar repositories for mt5-M2M-comparison
Users interested in mt5-M2M-comparison are comparing it to the libraries listed below.
- Fine-tuning GPT-2 Small for Question Answering☆130Updated 2 years ago
- Load What You Need: Smaller Multilingual Transformers for Pytorch and TensorFlow 2.0.☆105Updated 3 years ago
- PyTorch implementation of Recurrence over BERT (RoBERT) based on this paper https://arxiv.org/abs/1910.10781 and comparison with PyTorch …☆82Updated 2 years ago
- Some notebooks for NLP☆207Updated last year
- A package for fine-tuning Transformers with TPUs, written in Tensorflow2.0+☆37Updated 4 years ago
- MAFAND-MT☆58Updated last year
- Abstractive and Extractive Text summarization using Transformers.☆86Updated 2 years ago
- ☆34Updated 4 years ago
- A simple approach to use GPT2-medium (345M) for generating high quality text summaries with minimal training.☆156Updated 2 years ago
- The source code of "Language Models are Few-shot Multilingual Learners" (MRL @ EMNLP 2021)☆53Updated 3 years ago
- Open source library for few shot NLP☆79Updated 2 years ago
- Google's Meena transformer chatbot implementation☆105Updated 3 years ago
- MobileBERT and DistilBERT for extractive summarization☆92Updated 2 years ago
- Efficient Attention for Long Sequence Processing☆98Updated last year
- A Python package to augment text data using NLP.☆39Updated 7 months ago
- XtremeDistil framework for distilling/compressing massive multilingual neural network models to tiny and efficient models for AI at scale☆156Updated last year
- Python-based implementation of the Translate-Align-Retrieve method to automatically translate the SQuAD Dataset to Spanish.☆59Updated 2 years ago
- Comprehensive NLP Evaluation System☆188Updated last year
- TorchServe+Streamlit for easily serving your HuggingFace NER models☆33Updated 3 years ago
- Benchmarking various Deep Learning models such as BERT, ALBERT, BiLSTMs on the task of sentence entailment using two datasets - MultiNLI …☆28Updated 4 years ago
- [EMNLP 2021] LM-Critic: Language Models for Unsupervised Grammatical Error Correction☆120Updated 4 years ago
- Arabic edition of ALBERT pretrained language models☆16Updated 4 years ago
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP☆58Updated 3 years ago
- Fine-tuned BERT on SQuAD 2.0 Dataset. Applied Knowledge Distillation (KD) and fine-tuned DistilBERT (student) using BERT as the teacher m…☆25Updated 4 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any Hugging Face text dataset.☆95Updated 2 years ago
- ☆184Updated 2 years ago
- Implementation of Z-BERT-A: a zero-shot pipeline for unknown intent detection.☆43Updated 2 years ago
- Zero-shot Transfer Learning from English to Arabic☆30Updated 3 years ago
- Paraphrase any question with T5 (Text-To-Text Transfer Transformer) - Pretrained model and training script provided☆185Updated 2 years ago
- A search engine for ParlAI's BlenderBot project (and probably other ones as well)☆130Updated 3 years ago