zphang / minimal-gpt-neox-20b
☆128 · Updated 2 years ago
Related projects
Alternatives and complementary repositories for minimal-gpt-neox-20b
- Experiments with generating open-source language model assistants ☆97 · Updated last year
- Exploring fine-tuning public checkpoints on filtered 8K sequences from the Pile ☆115 · Updated last year
- DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective. ☆164 · Updated 6 months ago
- Simple annotated implementation of GPT-NeoX in PyTorch ☆111 · Updated 2 years ago
- ☆64 · Updated 2 years ago
- Used for adaptive human-in-the-loop evaluation of language and embedding models. ☆304 · Updated last year
- RWKV-v2-RNN trained on the Pile. See https://github.com/BlinkDL/RWKV-LM for details. ☆66 · Updated 2 years ago
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in Jax (Equinox framework) ☆185 · Updated 2 years ago
- Inference code for LLaMA models in JAX ☆113 · Updated 6 months ago
- See the issue board for the current status of active and prospective projects! ☆65 · Updated 2 years ago
- One-stop shop for all things CARP ☆59 · Updated 2 years ago
- A multilingual dataset for parsing realistic task-oriented dialogs ☆113 · Updated last year
- Instruct-tuning LLaMA on consumer hardware ☆66 · Updated last year
- MiniHF is an inference, human preference data collection, and fine-tuning tool for local language models. It is intended to help the user… ☆151 · Updated this week
- This project aims to make RWKV accessible to everyone using a Hugging Face-like interface, while keeping it close to the R and D RWKV bra… ☆64 · Updated last year
- Framework-agnostic Python runtime for RWKV models ☆145 · Updated last year
- Multi-Domain Expert Learning ☆67 · Updated 9 months ago
- Dataset collection and preprocessing framework for NLP extreme multi-task learning ☆149 · Updated 4 months ago
- JAX implementation of the Llama 2 model ☆210 · Updated 9 months ago
- Pipeline for pulling and processing online language model pretraining data from the web ☆174 · Updated last year
- 🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0. ☆55 · Updated 2 years ago
- Code repository for the c-BTM paper ☆105 · Updated last year
- A library for squeakily cleaning and filtering language datasets. ☆45 · Updated last year
- Train very large language models in Jax. ☆195 · Updated last year
- Guide: Fine-tune GPT-2 XL (1.5 billion parameters) and GPT-Neo (2.7B) on a single GPU with Hugging Face Transformers using DeepSpe… ☆432 · Updated last year
- Finetune Falcon, LLaMA, MPT, and RedPajama on consumer hardware using PEFT LoRA ☆101 · Updated 3 months ago
- Train vision models using JAX and 🤗 Transformers ☆95 · Updated 3 weeks ago
- Erasing concepts from neural representations with provable guarantees ☆209 · Updated last week
- A crude RLHF layer on top of nanoGPT with Gumbel-Softmax trick ☆287 · Updated 11 months ago