pytorch-tpu / transformers
🤗 Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch.
☆17 · Updated this week
Alternatives and similar repositories for transformers:
Users interested in transformers are comparing it to the libraries listed below.
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data; it should work with any Hugging Face text dataset. ☆93 · Updated 2 years ago
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP ☆58 · Updated 2 years ago
- A framework that aims to wisely initialize unseen subword embeddings in PLMs for efficient large-scale continued pretraining ☆16 · Updated last year
- ☆97 · Updated 2 years ago
- BLOOM+1: Adapting the BLOOM model to support a new unseen language ☆71 · Updated last year
- PyTorch implementation of EncT5: Fine-tuning T5 Encoder for Non-autoregressive Tasks ☆63 · Updated 3 years ago
- ☆72 · Updated last year
- ☆54 · Updated 2 years ago
- Calculates the expected time for training an LLM. ☆38 · Updated last year
- Repo for the ICML 2023 paper "Why do Nearest Neighbor Language Models Work?" ☆56 · Updated 2 years ago
- Evaluation pipeline for the BabyLM Challenge 2023. ☆75 · Updated last year
- ☆97 · Updated 2 years ago
- Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models. ☆80 · Updated 6 months ago
- ☆44 · Updated 4 years ago
- The official implementation of "Evidentiality-guided Generation for Knowledge-Intensive NLP Tasks" (NAACL 2022). ☆43 · Updated 2 years ago
- Megatron-LM 11B on Hugging Face Transformers ☆27 · Updated 3 years ago
- ☆38 · Updated 11 months ago
- The official code of EMNLP 2022, "SCROLLS: Standardized CompaRison Over Long Language Sequences". ☆69 · Updated last year
- ☆65 · Updated last year
- Research code for the paper "How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models" ☆26 · Updated 3 years ago
- Official repository for our EACL 2023 paper "LongEval: Guidelines for Human Evaluation of Faithfulness in Long-form Summarization" (https… ☆43 · Updated 8 months ago
- DQ-BART: Efficient Sequence-to-Sequence Model via Joint Distillation and Quantization (ACL 2022) ☆50 · Updated last year
- Code for the ACL 2022 paper "Expanding Pretrained Models to Thousands More Languages via Lexicon-based Adaptation" ☆30 · Updated 3 years ago
- Mr. TyDi is a multi-lingual benchmark dataset built on TyDi, covering eleven typologically diverse languages. ☆75 · Updated 3 years ago
- A minimal PyTorch Lightning OpenAI GPT with DeepSpeed training! ☆111 · Updated last year
- Tools for evaluating the performance of MT metrics on data from recent WMT metrics shared tasks. ☆103 · Updated 3 weeks ago
- Hugging Face RoBERTa with Flash Attention 2 ☆22 · Updated last year
- M2D2: A Massively Multi-domain Language Modeling Dataset (EMNLP 2022) by Machel Reid, Victor Zhong, Suchin Gururangan, Luke Zettlemoyer ☆55 · Updated 2 years ago
- ☆24 · Updated 2 years ago
- Data-related codebase for the Polyglot project ☆19 · Updated 2 years ago
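One of the repos above estimates expected LLM training time. As a rough sketch of that kind of calculation (not taken from that repo), the widely used approximation C ≈ 6·N·D total training FLOPs for a dense transformer with N parameters trained on D tokens gives a back-of-the-envelope wall-clock estimate. The function name, utilization figure, and hardware numbers below are illustrative assumptions:

```python
def training_time_days(n_params, n_tokens, cluster_flops_per_sec, mfu=0.4):
    """Back-of-the-envelope training-time estimate.

    Uses the common C ~= 6 * N * D approximation for total training
    FLOPs of a dense transformer (N parameters, D tokens), divided by
    the cluster's sustained throughput (peak FLOP/s times an assumed
    model FLOPs utilization, `mfu`).
    """
    total_flops = 6.0 * n_params * n_tokens      # total compute budget
    sustained = cluster_flops_per_sec * mfu      # effective FLOP/s
    return total_flops / sustained / 86_400      # seconds -> days

# Illustrative only: a 7B-parameter model on 1T tokens, 64 accelerators
# at 312 TFLOP/s peak each, 40% utilization (all numbers assumed).
days = training_time_days(7e9, 1e12, 64 * 312e12, mfu=0.4)
```

With these assumed numbers the estimate comes out to roughly two months; real schedules also depend on data pipeline, checkpointing, and restarts, which this sketch ignores.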