maroxtn / mt5-M2M-comparison
Comparing M2M and mT5 on a rare language pair. Blog post: https://medium.com/@abdessalemboukil/comparing-facebooks-m2m-to-mt5-in-low-resources-translation-english-yoruba-ef56624d2b75
☆16 Updated 4 years ago
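For readers who want a quick starting point, here is a minimal sketch (not taken from this repository) of zero-shot English-to-Yoruba translation with M2M-100 via the Hugging Face Transformers library. The `facebook/m2m100_418M` checkpoint and the example sentence are illustrative assumptions; the repository's own fine-tuning and mT5 comparison are not reproduced here.

```python
# Minimal sketch, assuming the public facebook/m2m100_418M checkpoint.
# The repo's actual fine-tuning/comparison setup is not shown here.
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

tokenizer.src_lang = "en"  # source language: English
inputs = tokenizer("How are you today?", return_tensors="pt")

# Force the decoder to start with the Yoruba ("yo") language token.
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.get_lang_id("yo"),
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```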
Alternatives and similar repositories for mt5-M2M-comparison
Users interested in mt5-M2M-comparison are comparing it to the libraries listed below.
- Fine-tuned BERT on SQuAD 2.0 Dataset. Applied Knowledge Distillation (KD) and fine-tuned DistilBERT (student) using BERT as the teacher m…☆26 Updated 4 years ago
- Fine-tuning GPT-2 Small for Question Answering☆130 Updated 3 years ago
- ☆184 Updated 2 years ago
- TorchServe+Streamlit for easily serving your HuggingFace NER models☆33 Updated 3 years ago
- A package for fine-tuning Transformers with TPUs, written in TensorFlow 2.0+☆37 Updated 4 years ago
- MAFAND-MT☆60 Updated last year
- A simple approach to use GPT2-medium (345M) for generating high quality text summaries with minimal training.☆157 Updated 3 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any Hugging Face text dataset.☆96 Updated 2 years ago
- Load What You Need: Smaller Multilingual Transformers for PyTorch and TensorFlow 2.0.☆105 Updated 3 years ago
- Some notebooks for NLP☆207 Updated 2 years ago
- A repo to explore different NLP tasks which can be solved using T5☆173 Updated 5 years ago
- XtremeDistil framework for distilling/compressing massive multilingual neural network models to tiny and efficient models for AI at scale☆157 Updated 2 years ago
- An example of multilingual machine translation using a pretrained version of mT5 from Hugging Face.☆42 Updated 4 years ago
- Open source library for few shot NLP☆78 Updated 2 years ago
- Efficient Attention for Long Sequence Processing☆98 Updated 2 years ago
- PyTorch implementation of NMT models along with custom tokenizers, models, and datasets☆21 Updated 3 years ago
- DialogSum: A Real-life Scenario Dialogue Summarization Dataset - Findings of ACL 2021☆183 Updated last year
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP☆58 Updated 3 years ago
- Abstractive and Extractive Text summarization using Transformers.☆86 Updated 2 years ago
- MobileBERT and DistilBERT for extractive summarization☆93 Updated 2 years ago
- NTREX -- News Test References for MT Evaluation☆87 Updated last year
- Helper scripts and notes that were used while porting various NLP models☆49 Updated 3 years ago
- Zero-shot Transfer Learning from English to Arabic☆30 Updated 3 years ago
- Python-based implementation of the Translate-Align-Retrieve method to automatically translate the SQuAD Dataset to Spanish.☆59 Updated 3 years ago
- ☆34 Updated 5 years ago
- PyTorch – SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models.☆62 Updated 3 years ago
- PyTorch implementation of Recurrence over BERT (RoBERT) based on this paper https://arxiv.org/abs/1910.10781 and comparison with pyTorch …☆82 Updated 3 years ago
- This dataset contains synthetic training data for grammatical error correction. The corpus is generated by corrupting clean sentences fro…☆162 Updated last year
- Pipeline for pulling and processing online language model pretraining data from the web☆177 Updated 2 years ago
- ☆94 Updated 3 years ago