pytorch-tpu / examples
This repository contains example code for building models on TPUs.
☆30 · Updated 2 years ago
Alternatives and similar repositories for examples:
Users interested in examples are comparing it to the libraries listed below.
- A case study of efficient training of large language models using commodity hardware. ☆69 · Updated 2 years ago
- HomebrewNLP in JAX flavour for maintainable TPU training. ☆49 · Updated last year
- Various transformers for FSDP research. ☆37 · Updated 2 years ago
- TPU support for the fastai library. ☆13 · Updated 3 years ago
- A collection of Models, Datasets, DataModules, Callbacks, Metrics, Losses and Loggers to better integrate pytorch-lightning with transfor… ☆47 · Updated last year
- Code for scaling Transformers. ☆26 · Updated 4 years ago
- Standalone pre-training recipe with JAX+Flax. ☆31 · Updated 2 years ago
- PyTorch implementation of GLOM. ☆21 · Updated 3 years ago
- LM pretraining with PyTorch/TPU. ☆134 · Updated 5 years ago
- ☆87 · Updated 2 years ago
- Helper scripts and notes that were used while porting various NLP models. ☆46 · Updated 3 years ago
- A library for squeakily cleaning and filtering language datasets. ☆46 · Updated last year
- Your fruity companion for transformers. ☆14 · Updated 2 years ago
- ☆60 · Updated 3 years ago
- My explorations into editing the knowledge and memories of an attention network. ☆34 · Updated 2 years ago
- A library to create and manage configuration files, especially for machine learning projects. ☆77 · Updated 3 years ago
- Dense Passage Retrieval using tensorflow-keras on TPU. ☆15 · Updated 3 years ago
- Language-agnostic BERT Sentence Embedding (LaBSE) PyTorch model. ☆21 · Updated 4 years ago
- A place to store reusable transformer components of my own creation or found on the interwebs. ☆48 · Updated this week
- A lightweight wrapper for PyTorch that provides a simple declarative API for context switching between devices, distributed modes, mixed-… ☆67 · Updated last year
- A minimal PyTorch re-implementation of OpenAI GPT (Generative Pretrained Transformer) training. ☆26 · Updated 2 years ago
- QAmeleon introduces synthetic multilingual QA data using PaLM, a 540B large language model. This dataset was generated by prompt tuning P… ☆34 · Updated last year
- Implementation of the OpenAI paper on Simple Noise Scale with fastai v2. ☆19 · Updated 3 years ago
- Amos optimizer with the JEstimator lib. ☆82 · Updated 10 months ago
- A diff tool for language models. ☆42 · Updated last year
- Implementation of N-Grammer, augmenting Transformers with latent n-grams, in PyTorch. ☆73 · Updated 2 years ago
- Babysit your preemptible TPUs. ☆85 · Updated 2 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data; it should work with any Hugging Face text dataset. ☆93 · Updated 2 years ago
- GPT, but made only out of MLPs. ☆88 · Updated 3 years ago
- Code repo for the "Transformer on a Diet" paper. ☆31 · Updated 4 years ago