tshrjn / Finetune-QA
BERT and RoBERTa fine-tuning on the SQuAD dataset using pytorch-lightning⚡️, 🤗-transformers & 🤗-nlp.
☆36 · Updated last year
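For orientation, the sketch below shows what a minimal SQuAD fine-tuning setup with 🤗-transformers and pytorch-lightning can look like. It is an illustrative assumption, not this repo's actual code: the model name, learning rate, module name, and the omitted data pipeline are all placeholders.

```python
# Minimal sketch (assumed, not the repo's actual code): extractive-QA fine-tuning
# with 🤗-transformers and pytorch-lightning. Model name, learning rate, and the
# data pipeline left out here are illustrative assumptions.
import pytorch_lightning as pl
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

class SquadQAModule(pl.LightningModule):
    def __init__(self, model_name: str = "bert-base-uncased", lr: float = 3e-5):
        super().__init__()
        self.tokenizer = AutoTokenizer.from_pretrained(model_name)
        self.model = AutoModelForQuestionAnswering.from_pretrained(model_name)
        self.lr = lr

    def training_step(self, batch, batch_idx):
        # When start_positions/end_positions are present in the batch, the model
        # returns the combined span-extraction loss.
        outputs = self.model(**batch)
        self.log("train_loss", outputs.loss)
        return outputs.loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=self.lr)

# Training would then be roughly:
#   trainer = pl.Trainer(max_epochs=2)
#   trainer.fit(SquadQAModule(), train_dataloader)
# where the dataloader yields tokenized SQuAD features with answer start/end positions.
```

Most of the work in extractive QA is feature preparation (mapping character-level answer spans to token start/end positions); the sketch assumes a dataloader that already provides those fields.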
Related projects
Alternatives and complementary repositories for Finetune-QA
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP ☆58 · Updated 2 years ago
- Code and datasets of "Multilingual Extractive Reading Comprehension by Runtime Machine Translation" ☆39 · Updated 5 years ago
- Fine-tune transformers with pytorch-lightning ☆44 · Updated 2 years ago
- Using business-level retrieval system (BM25) with Python in just a few lines (a minimal BM25 sketch appears after this list). ☆31 · Updated last year
- Implementation of Marge, Pre-training via Paraphrasing, in Pytorch ☆75 · Updated 3 years ago
- This is the official implementation of NeurIPS 2021 "One Question Answering Model for Many Languages with Cross-lingual Dense Passage Retrieval" ☆70 · Updated 2 years ago
- On the Stability of Fine-tuning BERT: Misconceptions, Explanations, and Strong Baselines ☆131 · Updated last year
- This repository hosts my experiments for the project I did with OffNote Labs. ☆11 · Updated 3 years ago
- Helper scripts and notes that were used while porting various NLP models ☆44 · Updated 2 years ago
- Pytorch Implementation of EncT5: Fine-tuning T5 Encoder for Non-autoregressive Tasks ☆63 · Updated 2 years ago
- EMNLP 2021 Tutorial: Multi-Domain Multilingual Question Answering ☆38 · Updated 3 years ago
- Data & Code for ACCENTOR: "Adding Chit-Chat to Enhance Task-Oriented Dialogues" (NAACL 2021) ☆71 · Updated 3 years ago
- A package for fine-tuning Transformers with TPUs, written in Tensorflow2.0+ ☆37 · Updated 3 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any Hugging Face text dataset. ☆92 · Updated last year
- ARCHIVED. Please use https://docs.adapterhub.ml/huggingface_hub.html || 🔌 A central repository collecting pre-trained adapter modules ☆68 · Updated 5 months ago
- The implementation of "Neural Machine Translation without Embeddings", NAACL 2021 ☆33 · Updated 3 years ago
- This is the official repository for NAACL 2021, "XOR QA: Cross-lingual Open-Retrieval Question Answering". ☆78 · Updated 3 years ago
- PyTorch reimplementation of REALM and ORQA ☆22 · Updated 2 years ago
- An official implementation of the "BPE-Dropout: Simple and Effective Subword Regularization" algorithm. ☆48 · Updated 3 years ago
- A BART version of an open-domain QA model in a closed-book setup ☆119 · Updated 4 years ago
- Official Pytorch Implementation of Length-Adaptive Transformer (ACL 2021) ☆100 · Updated 4 years ago
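As referenced in the BM25 entry above, here is a minimal lexical-retrieval sketch in a few lines. It uses the rank_bm25 package purely for illustration; the linked repo may expose a different interface.

```python
# Illustrative BM25 retrieval in a few lines, using the rank_bm25 package
# (an assumption for this sketch; the linked repo may provide its own API).
from rank_bm25 import BM25Okapi

corpus = [
    "BERT is fine-tuned on SQuAD for extractive question answering.",
    "BM25 is a classic lexical ranking function used by search engines.",
    "PyTorch Lightning organizes PyTorch training loops.",
]
tokenized_corpus = [doc.lower().split() for doc in corpus]
bm25 = BM25Okapi(tokenized_corpus)

query = "what is bm25 ranking".split()
scores = bm25.get_scores(query)               # one relevance score per document
top_doc = bm25.get_top_n(query, corpus, n=1)  # highest-scoring document(s)
print(scores, top_doc)
```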