Shivanandroy / simpleT5
simpleT5 is built on top of PyTorch Lightning⚡️ and Transformers🤗 and lets you quickly train your T5 models.
☆393 · Updated last year
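As a quick illustration of the workflow simpleT5 streamlines, here is a minimal sketch. The `source_text`/`target_text` column names and the `SimpleT5` calls follow the project's README at the time of writing and may differ in newer versions, so treat them as assumptions and check the current docs. Only the data-preparation part below runs as-is; the training calls are shown in comments because they require installing the package and downloading a T5 checkpoint.

```python
import pandas as pd

# simpleT5 expects two-column DataFrames; the column names
# ("source_text", "target_text") are taken from the project's README
# and may change between versions.
train_df = pd.DataFrame({
    "source_text": [
        "summarize: simpleT5 is built on top of PyTorch Lightning "
        "and Transformers and lets you quickly train T5 models."
    ],
    "target_text": ["simpleT5 makes T5 training quick."],
})

print(train_df.columns.tolist())  # → ['source_text', 'target_text']

# The actual fine-tuning step (requires `pip install simplet5` and a
# checkpoint download) would look roughly like:
#
#   from simplet5 import SimpleT5
#   model = SimpleT5()
#   model.from_pretrained(model_type="t5", model_name="t5-base")
#   model.train(train_df=train_df, eval_df=train_df, max_epochs=1)
#   print(model.predict("summarize: ..."))
```

The "task prefix" in `source_text` (here `summarize:`) follows the T5 convention of encoding the task in the input text itself, which is what lets one model family cover several of the tasks in the list below.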
Alternatives and similar repositories for simpleT5:
Users interested in simpleT5 are comparing it to the libraries listed below.
- A repo to explore different NLP tasks which can be solved using T5 ☆172 · Updated 4 years ago
- Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive… ☆431 · Updated 2 years ago
- This repository contains the code for "Generating Datasets with Pretrained Language Models". ☆188 · Updated 3 years ago
- ⚡ Boost inference speed of T5 models by 5x and reduce model size by 3x. ☆578 · Updated 2 years ago
- UnifiedQA: Crossing Format Boundaries With a Single QA System ☆435 · Updated 2 years ago
- [ACL 2021] Learning Dense Representations of Phrases at Scale; EMNLP 2021: Phrase Retrieval Learns Passage Retrieval, Too https://arxiv.o… ☆604 · Updated 2 years ago
- Resources for the "SummEval: Re-evaluating Summarization Evaluation" paper ☆391 · Updated 10 months ago
- Fine-tune a T5 transformer model using PyTorch & Transformers🤗 ☆212 · Updated 4 years ago
- ☆182 · Updated last year
- Reproduce results and replicate training of T0 (Multitask Prompted Training Enables Zero-Shot Task Generalization) ☆463 · Updated 2 years ago
- BARTScore: Evaluating Generated Text as Text Generation ☆349 · Updated 2 years ago
- BLEURT is a metric for Natural Language Generation based on transfer learning. ☆726 · Updated last year
- NL-Augmenter 🦎 → 🐍 A Collaborative Repository of Natural Language Transformations ☆783 · Updated 11 months ago
- Resources for the "Evaluating the Factual Consistency of Abstractive Text Summarization" paper ☆297 · Updated this week
- A Neural Language Style Transfer framework to transfer natural language text smoothly between fine-grained language styles like formal/ca… ☆485 · Updated last year
- Guide: Finetune GPT2-XL (1.5 billion parameters) and GPT-NEO (2.7B) on a single GPU with Huggingface Transformers using DeepSpe… ☆437 · Updated last year
- A simple approach to using GPT2-medium (345M) for generating high-quality text summaries with minimal training. ☆156 · Updated 2 years ago
- Efficient Attention for Long Sequence Processing ☆93 · Updated last year
- ☆190 · Updated last year
- A Python library that makes AMR parsing, generation, and visualization simple. ☆238 · Updated last year
- Fine-tuning GPT-2 Small for Question Answering ☆130 · Updated 2 years ago
- This dataset contains synthetic training data for grammatical error correction. The corpus is generated by corrupting clean sentences fro… ☆161 · Updated 7 months ago
- [NeurIPS'22 Spotlight] A Contrastive Framework for Neural Text Generation ☆471 · Updated last year
- Obtain Word Alignments using Pretrained Language Models (e.g., mBERT) ☆360 · Updated last year
- Official code and data repository for our EMNLP 2020 long paper "Reformulating Unsupervised Style Transfer as Paraphrase Generation" (htt… ☆236 · Updated 2 years ago
- Powerful unsupervised domain adaptation method for dense retrieval. Requires only an unlabeled corpus and yields massive improvement: "GPL: … ☆333 · Updated last year
- Repo for fine-tuning causal LLMs ☆454 · Updated last year
- Language model fine-tuning on NER with an easy interface and cross-domain evaluation. "T-NER: An All-Round Python Library for Transformer… ☆387 · Updated last year
- ☆345 · Updated 3 years ago
- ACL 2022: BRIO: Bringing Order to Abstractive Summarization ☆335 · Updated 6 months ago