VE-FORBRYDERNE / mtj-softtuner
Create soft prompts for fairseq 13B dense, GPT-J-6B and GPT-Neo-2.7B for free in a Google Colab TPU instance
⭐ 28 · Updated 2 years ago
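mtj-softtuner trains soft prompts on free Colab TPU instances. As a rough illustration of the underlying idea only (not this repo's API or code), the sketch below shows soft-prompt ("prompt tuning") training of one of the listed model sizes using the Hugging Face PEFT library; the model name, virtual-token count, and initialization text are illustrative assumptions.

```python
# Generic soft-prompt (prompt tuning) sketch -- NOT mtj-softtuner's own API.
# Assumes the `transformers` and `peft` packages are installed; model name and
# num_virtual_tokens are illustrative choices.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PromptTuningConfig, PromptTuningInit, TaskType, get_peft_model

model_name = "EleutherAI/gpt-neo-2.7B"  # one of the model sizes mentioned above
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# A soft prompt is a small block of trainable embedding vectors ("virtual tokens")
# prepended to every input; the base model's weights stay frozen.
peft_config = PromptTuningConfig(
    task_type=TaskType.CAUSAL_LM,
    prompt_tuning_init=PromptTuningInit.TEXT,
    prompt_tuning_init_text="Write in the style of the training data:",
    num_virtual_tokens=20,
    tokenizer_name_or_path=model_name,
)
model = get_peft_model(model, peft_config)
model.print_trainable_parameters()  # only the virtual-token embeddings are trainable
```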
Alternatives and similar repositories for mtj-softtuner
Users who are interested in mtj-softtuner are comparing it to the libraries listed below.
- 🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0. ⭐ 55 · Updated 3 years ago
- Hidden Engrams: Long Term Memory for Transformer Model Inference ⭐ 35 · Updated 4 years ago
- One stop shop for all things carp ⭐ 59 · Updated 3 years ago
- Fine-tuning 6-Billion GPT-J (& other models) with LoRA and 8-bit compression ⭐ 68 · Updated 3 years ago
- Conversational language model toolkit for training against human preferences. ⭐ 41 · Updated last year
- Experimental sampler to make LLMs more creative ⭐ 31 · Updated 2 years ago
- A basic UI for running GPT-Neo 2.7B on low VRAM (3 GB VRAM minimum) ⭐ 36 · Updated 4 years ago
- Experiments with generating open-source language model assistants ⭐ 97 · Updated 2 years ago
- Training and implementation of chatbots leveraging GPT-like architectures with the aitextgen package to enable dynamic conversations. ⭐ 49 · Updated 3 years ago
- ⭐ 27 · Updated 2 years ago
- A ready-to-deploy container for implementing an easy-to-use REST API to access language models. ⭐ 66 · Updated 2 years ago
- Text-writing denoising diffusion (and much more) ⭐ 30 · Updated 2 years ago
- ⭐ 50 · Updated 2 years ago
- ⭐ 33 · Updated 2 years ago
- ⭐ 131 · Updated 3 years ago
- ⭐ 44 · Updated 2 years ago
- RWKV-v2-RNN trained on the Pile. See https://github.com/BlinkDL/RWKV-LM for details. ⭐ 66 · Updated 3 years ago
- A client library for LAION's effort to filter CommonCrawl with CLIP, building a large-scale image-text dataset. ⭐ 31 · Updated 2 years ago
- Reimplementation of the task generation part from the Alpaca paper ⭐ 118 · Updated 2 years ago
- This project aims to make RWKV accessible to everyone using a Hugging Face-like interface, while keeping it close to the R and D RWKV bra… ⭐ 64 · Updated 2 years ago
- This repository contains all the code for collecting large-scale amounts of code from GitHub. ⭐ 109 · Updated 2 years ago
- Framework-agnostic Python runtime for RWKV models ⭐ 146 · Updated 2 years ago
- Prompt tuning toolkit for GPT-2 and GPT-Neo ⭐ 89 · Updated 4 years ago
- ⭐ 40 · Updated 2 years ago
- ⭐ 33 · Updated 2 years ago
- Multi-Domain Expert Learning ⭐ 66 · Updated last year
- Simple annotated implementation of GPT-NeoX in PyTorch ⭐ 110 · Updated 3 years ago
- A library for squeakily cleaning and filtering language datasets. ⭐ 47 · Updated 2 years ago
- 4-bit quantization of SantaCoder using GPTQ ⭐ 50 · Updated 2 years ago
- The GeoV model is a large language model designed by Georges Harik and uses Rotary Positional Embeddings with Relative distances (RoPER)… ⭐ 121 · Updated 2 years ago