OptimalFoundation / nadir
Nadir: Cutting-edge PyTorch optimizers for simplicity & composability! 🔥🚀💻
⭐14 · Updated last year
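Nadir's own API is not shown on this page, so as a hedged illustration only, here is a minimal pure-Python sketch of the `torch.optim`-style contract (`step()` / `zero_grad()`) that drop-in PyTorch optimizer libraries like this one implement. All class and parameter names below are illustrative, not Nadir's actual interface; no `torch` dependency is used.

```python
# Minimal sketch of the torch.optim-style optimizer contract that
# drop-in PyTorch optimizer libraries implement. Names are illustrative;
# this is NOT Nadir's actual API.

class Param:
    """Stand-in for a tensor parameter carrying a gradient."""
    def __init__(self, value):
        self.value = value
        self.grad = 0.0

class SGD:
    """Plain gradient descent exposing the step()/zero_grad() contract."""
    def __init__(self, params, lr=0.1):
        self.params = list(params)
        self.lr = lr

    def step(self):
        # One update: p <- p - lr * grad
        for p in self.params:
            p.value -= self.lr * p.grad

    def zero_grad(self):
        for p in self.params:
            p.grad = 0.0

# Usage: minimize f(x) = x^2, whose gradient is 2x.
x = Param(1.0)
opt = SGD([x], lr=0.1)
for _ in range(50):
    opt.zero_grad()
    x.grad = 2.0 * x.value   # stand-in for the backward pass
    opt.step()
print(round(x.value, 4))  # → 0.0 (converges toward the minimum)
```

Because every optimizer exposes the same two-method contract, swapping in a different update rule is a one-line change in the training loop, which is the composability such libraries advertise.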
Alternatives and similar repositories for nadir
Users interested in nadir are comparing it to the libraries listed below.
- ⭐34 · Updated 2 years ago
- Exploring finetuning public checkpoints on filtered 8K sequences on the Pile ⭐116 · Updated 2 years ago
- A case study of efficient training of large language models using commodity hardware. ⭐68 · Updated 3 years ago
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in Jax (Equinox framework) ⭐188 · Updated 3 years ago
- Like picoGPT but for BERT. ⭐50 · Updated 2 years ago
- Large scale 4D parallelism pre-training for 🤗 transformers in Mixture of Experts *(still work in progress)* ⭐87 · Updated last year
- Functional local implementations of main model parallelism approaches ⭐96 · Updated 2 years ago
- Amos optimizer with JEstimator lib. ⭐82 · Updated last year
- HomebrewNLP in JAX flavour for maintainable TPU-training ⭐50 · Updated last year
- ⭐61 · Updated 3 years ago
- Experiments with generating open-source language model assistants ⭐97 · Updated 2 years ago
- Deep learning library implemented from scratch in NumPy. Mixtral, Mamba, LLaMA, GPT, ResNet, and other experiments. ⭐52 · Updated last year
- ML/DL math and method notes ⭐63 · Updated last year
- Scripts to convert datasets from various sources to Hugging Face Datasets. ⭐57 · Updated 2 years ago
- Serialize JAX, Flax, Haiku, or Objax model params with 🤗`safetensors` ⭐46 · Updated last year
- Smol but mighty language model ⭐63 · Updated 2 years ago
- Our open-source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188) ⭐61 · Updated 2 years ago
- Various handy scripts to quickly set up new Linux and Windows sandboxes, containers and WSL. ⭐40 · Updated last week
- Transformer Grammars: Augmenting Transformer Language Models with Syntactic Inductive Biases at Scale, TACL (2022) ⭐130 · Updated 3 months ago
- This code repository contains the code used for my "Optimizing Memory Usage for Training LLMs and Vision Transformers in PyTorch" blog po… ⭐92 · Updated 2 years ago
- Experiments for efforts to train a new and improved T5 ⭐76 · Updated last year
- Various transformers for FSDP research ⭐38 · Updated 2 years ago
- A library to create and manage configuration files, especially for machine learning projects. ⭐79 · Updated 3 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any Hugging Face text dataset. ⭐95 · Updated 2 years ago
- NeurIPS Large Language Model Efficiency Challenge: 1 LLM + 1 GPU + 1 Day ⭐256 · Updated last year
- ⭐94 · Updated last year
- ⭐49 · Updated last year
- An interactive exploration of Transformer programming. ⭐269 · Updated last year
- Genalog is an open source, cross-platform Python package allowing generation of synthetic document images with custom degradations and te… ⭐42 · Updated last year
- Multi-Domain Expert Learning ⭐67 · Updated last year