EleutherAI / oslo
OSLO: Open Source for Large-scale Optimization
☆175 · Updated last year
Alternatives and similar repositories for oslo
Users interested in oslo are comparing it to the libraries listed below.
- OSLO: Open Source framework for Large-scale model Optimization ☆309 · Updated 2 years ago
- Data processing system for polyglot ☆91 · Updated last year
- Large-scale 4D parallelism pre-training for 🤗 transformers in Mixture of Experts *(still work in progress)* ☆85 · Updated last year
- Anh - LAION's multilingual assistant datasets and models ☆27 · Updated 2 years ago
- Evolve LLM training instructions, from English instructions to any language. ☆118 · Updated last year
- Some common Huggingface transformers in maximal update parametrization (µP) ☆81 · Updated 3 years ago
- FriendliAI Model Hub ☆91 · Updated 3 years ago
- Exploring finetuning public checkpoints on filtered 8K sequences on the Pile ☆116 · Updated 2 years ago
- A minimal PyTorch Lightning OpenAI GPT with DeepSpeed training! ☆112 · Updated 2 years ago
- Data-related codebase for the polyglot project ☆19 · Updated 2 years ago
- Sakura-SOLAR-DPO: Merge, SFT, and DPO ☆116 · Updated last year
- ☆67 · Updated 2 years ago
- Manage histories of LLM-based applications ☆91 · Updated last year
- ☆15 · Updated last month
- Inference code for LLaMA models in JAX ☆118 · Updated last year
- Pipeline for pulling and processing online language model pretraining data from the web ☆178 · Updated last year
- Experiments with generating open-source language model assistants ☆97 · Updated 2 years ago
- Implementation of a stop sequencer for Huggingface Transformers ☆16 · Updated 2 years ago
- Calculating the expected time for training an LLM ☆38 · Updated 2 years ago
- ☆19 · Updated 2 years ago
- 🥤🧑🏻‍🚀 Code and dataset for our EMNLP 2023 paper - "SODA: Million-scale Dialogue Distillation with Social Commonsense Contextualization… ☆232 · Updated last year
- [ICLR 2023] Guess the Instruction! Flipped Learning Makes Language Models Stronger Zero-Shot Learners ☆116 · Updated 3 weeks ago
- Large-scale language modeling tutorials with PyTorch ☆291 · Updated 3 years ago
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any Hugging Face text dataset. ☆93 · Updated 2 years ago
- ☆166 · Updated 2 years ago
- Code for the paper "The Impact of Positional Encoding on Length Generalization in Transformers", NeurIPS 2023 ☆136 · Updated last year
- The original implementation of Min et al. "Nonparametric Masked Language Modeling" (paper: https://arxiv.org/abs/2212.01349) ☆157 · Updated 2 years ago
- PyTorch/XLA SPMD test code on Google TPU ☆23 · Updated last year
- 🤗 Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch. ☆17 · Updated last month
- [AAAI 2024] Investigating the Effectiveness of Task-Agnostic Prefix Prompt for Instruction Following ☆79 · Updated 10 months ago