google-research / byt5
★501 · Updated last year

Alternatives and similar repositories for byt5:
Users interested in byt5 are comparing it to the libraries listed below.
- NL-Augmenter: A Collaborative Repository of Natural Language Transformations ★780 · Updated 10 months ago
- Pretrain and finetune ELECTRA with fastai and huggingface (results of the paper replicated!) ★327 · Updated last year
- An efficient implementation of the popular sequence models for text generation, summarization, and translation tasks. https://arxiv.org/p… ★432 · Updated 2 years ago
- Autoregressive Entity Retrieval ★781 · Updated last year
- Repository containing code for the "How to Train BERT with an Academic Budget" paper ★312 · Updated last year
- XTREME is a benchmark for evaluating the cross-lingual generalization ability of pre-trained multilingual models that covers 40 ty… ★640 · Updated 2 years ago
- BLEURT is a metric for Natural Language Generation based on transfer learning. ★719 · Updated last year
- FastFormers: highly efficient transformer models for NLU ★704 · Updated last year
- MPNet: Masked and Permuted Pre-training for Language Understanding. https://arxiv.org/pdf/2004.09297.pdf ★290 · Updated 3 years ago
- An Analysis Toolkit for Natural Language Generation (Translation, Captioning, Summarization, etc.) ★443 · Updated 2 weeks ago
- ⚡ Boost inference speed of T5 models by 5x & reduce the model size by 3x. ★578 · Updated last year
- Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive… ★429 · Updated last year
- Summarization, translation, sentiment analysis, text generation, and more at blazing speed using a T5 version implemented in ONNX. ★252 · Updated 2 years ago
- Officially supported AllenNLP models ★538 · Updated 2 years ago
- Repository for the paper "Optimal Subarchitecture Extraction for BERT" ★471 · Updated 2 years ago
- Task-based datasets, preprocessing, and evaluation for sequence models. ★571 · Updated this week
- [ACL 2021] Learning Dense Representations of Phrases at Scale; EMNLP 2021: Phrase Retrieval Learns Passage Retrieval, Too. https://arxiv.o… ★603 · Updated 2 years ago
- Fast BPE ★666 · Updated 9 months ago
- ★239 · Updated 5 years ago
- A Visual Analysis Tool to Explore Learned Representations in Transformer Models ★587 · Updated last year
- This dataset contains 108,463 human-labeled and 656k noisily labeled pairs that feature the importance of modeling structure, context, an… ★557 · Updated 3 years ago
- ★1,268 · Updated 2 years ago
- SentAugment is a data augmentation technique for NLP that retrieves similar sentences from a large bank of sentences. It can be used in c… ★362 · Updated 3 years ago
- Transformers for Longer Sequences ★596 · Updated 2 years ago
- The corresponding code from our paper "DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations". Do not hesitate to o… ★380 · Updated last year
- Adversarial Natural Language Inference Benchmark ★393 · Updated 2 years ago
- A tool for holistic analysis of language generation systems ★467 · Updated 3 years ago
- Parallelformers: An Efficient Model Parallelization Toolkit for Deployment ★785 · Updated last year
- Interpretable Evaluation for AI Systems ★363 · Updated 2 years ago
- ★461 · Updated 3 years ago