arrmansa / Basic-UI-for-GPT-J-6B-with-low-vram
A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2000-token context, 3.5 GB for a 1000-token context). Model loading requires 12 GB of free RAM.
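The low-VRAM approach these repositories describe (keeping the model in CPU RAM and streaming it onto the GPU in smaller parts) can be sketched roughly as follows. This is a hypothetical illustration, not the repository's actual code: a toy stack of linear layers stands in for GPT-J's transformer blocks, and each block is moved to the GPU only for its own forward pass, so peak VRAM is about one block plus activations.

```python
import torch
import torch.nn as nn

# Pick the GPU when available; the sketch degrades gracefully to CPU-only.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Stand-in for GPT-J's transformer blocks (hypothetical toy layers).
blocks = nn.ModuleList([nn.Linear(64, 64) for _ in range(4)])

def blockwise_forward(x, blocks, device):
    """Run x through the blocks, streaming one block at a time to `device`."""
    x = x.to(device)
    for block in blocks:
        block.to(device)   # load this block's weights onto the GPU
        x = block(x)
        block.to("cpu")    # evict it before loading the next block
    return x.cpu()

out = blockwise_forward(torch.randn(1, 64), blocks, device)
print(out.shape)  # torch.Size([1, 64])
```

The trade-off is extra host-to-device transfer time per forward pass in exchange for a much smaller VRAM footprint, which matches the minimum-VRAM figures quoted above.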
☆115 · Updated 3 years ago
Alternatives and similar repositories for Basic-UI-for-GPT-J-6B-with-low-vram:
Users interested in Basic-UI-for-GPT-J-6B-with-low-vram are comparing it to the repositories listed below:
- Colab notebooks to run a basic AI Dungeon clone using GPT-Neo-2.7B ☆64 · Updated 3 years ago
- A basic UI for running GPT-Neo 2.7B on low VRAM (3 GB VRAM minimum) ☆36 · Updated 3 years ago
- Just a repo with some AI Dungeon scripts ☆29 · Updated 3 years ago
- Tools with a GUI for GPT fine-tune data preparation ☆23 · Updated 3 years ago
- ☆130 · Updated 2 years ago
- A Gradio web UI for running large language models such as GPT-J 6B, OPT, GALACTICA, LLaMA, and Pygmalion ☆310 · Updated last year
- A ready-to-deploy container implementing an easy-to-use REST API for accessing language models ☆64 · Updated 2 years ago
- Framework-agnostic Python runtime for RWKV models ☆146 · Updated last year
- Hidden Engrams: Long Term Memory for Transformer Model Inference ☆35 · Updated 3 years ago
- Platform- and API-agnostic library for powering chatbots ☆24 · Updated 2 years ago
- NovelAI Research Tool and API implementations in Golang ☆43 · Updated 2 years ago
- Create soft prompts for fairseq 13B dense, GPT-J-6B, and GPT-Neo-2.7B for free in a Google Colab TPU instance ☆28 · Updated 2 years ago
- A latent text-to-image diffusion model ☆67 · Updated 2 years ago
- ☆54 · Updated 2 years ago
- Fine-tuning GPT-J-6B on Colab or an equivalent PC GPU with your custom datasets: 8-bit weights with low-rank adapters (LoRA) ☆74 · Updated 2 years ago
- Conversational language model toolkit for training against human preferences ☆42 · Updated last year
- ☆242 · Updated 2 years ago
- A one-click version of sd-webui-colab ☆162 · Updated 2 years ago
- A utility that downloads your Stable Diffusion images from Discord and lets you preview them with Streamlit ☆15 · Updated 2 years ago
- A simple set of commands to manage three levels of modified context and link world info where necessary ☆9 · Updated 3 years ago
- A GPT-J API to use with Python 3 to generate text, blogs, code, and more ☆205 · Updated 2 years ago
- A Discord AI generation bot to collect an aesthetic-rating dataset ☆60 · Updated 2 years ago
- A notebook that runs GPT-Neo with low VRAM (6 GB) and CUDA acceleration by loading it into GPU memory in smaller parts ☆14 · Updated 3 years ago
- Landmark Attention: Random-Access Infinite Context Length for Transformers (QLoRA) ☆123 · Updated last year
- A Colab notebook that combines Stable Diffusion + DALL-E Mini (Craiyon) ☆124 · Updated 2 years ago
- An Oobabooga extension for Bark TTS ☆118 · Updated last year
- ☆9 · Updated 3 years ago
- Home of `erlich` and `ongo`. Fine-tune latent-diffusion/glid-3-xl text2image on your own data ☆182 · Updated 2 years ago
- A simple annotated implementation of GPT-NeoX in PyTorch ☆110 · Updated 2 years ago
- A KoboldAI-like memory extension for oobabooga's text-generation-webui ☆108 · Updated 5 months ago