OptimalFoundation / nadir
Nadir: Cutting-edge PyTorch optimizers for simplicity & composability!
⭐14 · Updated 9 months ago
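Nadir's own API is not documented on this page, so the snippet below is only a minimal sketch of the standard drop-in pattern that third-party PyTorch optimizer libraries typically follow; `torch.optim.SGD`, the toy model, and the dummy data are placeholders chosen for illustration, not nadir's actual interface.

```python
# Minimal sketch of the usual PyTorch optimizer drop-in pattern.
# torch.optim.SGD stands in for a nadir optimizer (assumption: nadir's
# optimizers expose the same constructor / zero_grad / step interface).
import torch
import torch.nn as nn

model = nn.Linear(10, 1)                  # toy model for illustration
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

x = torch.randn(32, 10)                   # dummy batch
y = torch.randn(32, 1)

for _ in range(5):
    optimizer.zero_grad()                 # clear gradients from the previous step
    loss = criterion(model(x), y)         # forward pass
    loss.backward()                       # backward pass populates .grad
    optimizer.step()                      # apply the optimizer's update rule
```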
Alternatives and similar repositories for nadir:
Users that are interested in nadir are comparing it to the libraries listed below
- Utilities for Training Very Large Models · ⭐58 · Updated 6 months ago
- A pipeline for using API calls to agnostically convert unstructured data into structured training data · ⭐30 · Updated 6 months ago
- ⭐49 · Updated last year
- Deep learning library implemented from scratch in numpy. Mixtral, Mamba, LLaMA, GPT, ResNet, and other experiments. · ⭐51 · Updated 11 months ago
- Exploring finetuning public checkpoints on filtered 8K sequences on Pile · ⭐115 · Updated 2 years ago
- ⭐67 · Updated 2 years ago
- Large scale 4D parallelism pre-training for 🤗 transformers in Mixture of Experts *(still work in progress)* · ⭐81 · Updated last year
- Machine Learning eXperiment Utilities · ⭐46 · Updated 9 months ago
- Various transformers for FSDP research · ⭐37 · Updated 2 years ago
- HomebrewNLP in JAX flavour for maintainable TPU-Training · ⭐49 · Updated last year
- ML/DL Math and Method notes · ⭐59 · Updated last year
- QAmeleon introduces synthetic multilingual QA data using PaLM, a 540B large language model. This dataset was generated by prompt tuning P… · ⭐34 · Updated last year
- Experiments for efforts to train a new and improved t5 · ⭐77 · Updated 11 months ago
- ⭐60 · Updated 3 years ago
- Some common Huggingface transformers in maximal update parametrization (µP) · ⭐80 · Updated 3 years ago
- Arrakis is a library to conduct, track and visualize mechanistic interpretability experiments. · ⭐26 · Updated 3 weeks ago
- Collection of autoregressive model implementations · ⭐83 · Updated last month
- ⭐20 · Updated last year
- ⭐48 · Updated last year
- Proof-of-concept of global switching between numpy/jax/pytorch in a library. · ⭐18 · Updated 9 months ago
- Yet another random morning idea to be quickly tried and architecture shared if it works; to allow the transformer to pause for any amount… · ⭐53 · Updated last year
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any Hugging Face text dataset. · ⭐93 · Updated 2 years ago
- A library to create and manage configuration files, especially for machine learning projects. · ⭐77 · Updated 3 years ago
- Using short models to classify long texts · ⭐21 · Updated 2 years ago
- ⭐22 · Updated last year
- Experiments with generating open-source language model assistants · ⭐97 · Updated last year
- Our open-source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188) · ⭐61 · Updated last year
- My explorations into editing the knowledge and memories of an attention network · ⭐34 · Updated 2 years ago
- A case study of efficient training of large language models using commodity hardware. · ⭐69 · Updated 2 years ago
- Automatically take good care of your preemptible TPUs · ⭐36 · Updated last year