yifding / hetseq
HetSeq: Distributed GPU Training on Heterogeneous Infrastructure
☆106 · Updated 2 years ago
Alternatives and similar repositories for hetseq
Users interested in hetseq are comparing it to the libraries listed below.
- A case study of efficient training of large language models using commodity hardware. ☆68 · Updated 2 years ago
- Code for the Shortformer model, from the ACL 2021 paper by Ofir Press, Noah A. Smith and Mike Lewis. ☆147 · Updated 3 years ago
- ☆103 · Updated 4 years ago
- Functional deep learning ☆108 · Updated 2 years ago
- Check if you have training samples in your test set ☆64 · Updated 3 years ago
- The Python library with command line tools to interact with Dynabench (https://dynabench.org/), such as uploading models. ☆55 · Updated 3 years ago
- Trains Transformer model variants. Data isn't shuffled between batches. ☆143 · Updated 2 years ago
- A diff tool for language models ☆42 · Updated last year
- My implementation of DeepMind's Perceiver ☆63 · Updated 4 years ago
- A 🤗-style implementation of BERT using lambda layers instead of self-attention ☆69 · Updated 4 years ago
- Python Research Framework ☆106 · Updated 2 years ago
- GPT, but made only out of MLPs ☆89 · Updated 4 years ago
- A collection of code snippets for my PyTorch Lightning projects ☆108 · Updated 4 years ago
- ☆153 · Updated 5 years ago
- Official code for "Distributed Deep Learning in Open Collaborations" (NeurIPS 2021) ☆117 · Updated 3 years ago
- A collection of Models, Datasets, DataModules, Callbacks, Metrics, Losses and Loggers to better integrate pytorch-lightning with transfor… ☆47 · Updated 2 years ago
- Docs ☆144 · Updated 7 months ago
- Babysit your preemptible TPUs ☆86 · Updated 2 years ago
- ☆67 · Updated 2 years ago
- Implementation of the GBST block from the Charformer paper, in Pytorch ☆117 · Updated 4 years ago
- XtremeDistil framework for distilling/compressing massive multilingual neural network models to tiny and efficient models for AI at scale ☆155 · Updated last year
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in Jax (Equinox framework) ☆187 · Updated 3 years ago
- A library to create and manage configuration files, especially for machine learning projects. ☆78 · Updated 3 years ago
- Code for scaling Transformers ☆26 · Updated 4 years ago
- Implementation of Feedback Transformer in Pytorch ☆107 · Updated 4 years ago
- A simple and working implementation of Electra, the fastest way to pretrain language models from scratch, in Pytorch ☆227 · Updated 2 years ago
- Helper scripts and notes that were used while porting various nlp models ☆46 · Updated 3 years ago
- Amos optimizer with JEstimator lib. ☆82 · Updated last year
- Swarm training framework using Haiku + JAX + Ray for layer parallel transformer language models on unreliable, heterogeneous nodes ☆239 · Updated 2 years ago
- diagNNose is a Python library that facilitates a broad set of tools for analysing hidden activations of neural models. ☆82 · Updated last year