arrmansa / Basic-UI-for-GPT-J-6B-with-low-vram
A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2000-token context, 3.5 GB for a 1000-token context). Model loading requires 12 GB of free RAM.
☆115 · Updated 3 years ago
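The low-VRAM trick here is to keep most of GPT-J-6B's weights in CPU RAM and place only part of the model on the GPU. Below is a minimal sketch of that idea using Hugging Face transformers with accelerate-backed offloading; the model name, memory caps, and generation settings are illustrative assumptions, not the repository's own loading code.

```python
# Sketch: load GPT-J-6B with most weights offloaded to CPU RAM so only a few
# gigabytes of VRAM are needed. Memory limits below are assumptions, not the
# repository's exact figures.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-j-6B"

tokenizer = AutoTokenizer.from_pretrained(model_name)

# device_map="auto" with max_memory caps lets accelerate place as many layers
# as fit on the GPU and keep the rest in CPU RAM.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,                # fp16 halves the memory footprint
    device_map="auto",
    max_memory={0: "4GiB", "cpu": "24GiB"},   # small GPU, plenty of system RAM
)

prompt = "The meaning of life is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```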
Alternatives and similar repositories for Basic-UI-for-GPT-J-6B-with-low-vram
Users who are interested in Basic-UI-for-GPT-J-6B-with-low-vram are comparing it to the libraries listed below.
- A basic UI for running GPT-Neo 2.7B on low VRAM (3 GB VRAM minimum) ☆36 · Updated 4 years ago
- Colab notebooks to run a basic AI Dungeon clone using GPT-Neo-2.7B ☆62 · Updated 3 years ago
- 🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0 ☆56 · Updated 3 years ago
- Tools with a GUI for GPT fine-tuning data preparation ☆23 · Updated 3 years ago
- Just a repo with some AI Dungeon scripts ☆29 · Updated 3 years ago
- A GPT-J API to use with Python 3 to generate text, blogs, code, and more ☆204 · Updated 2 years ago
- ☆27 · Updated 2 years ago
- A Gradio web UI for running large language models like GPT-J 6B, OPT, GALACTICA, LLaMA, and Pygmalion ☆309 · Updated last year
- Framework-agnostic Python runtime for RWKV models ☆146 · Updated last year
- ☆130 · Updated 3 years ago
- A latent text-to-image diffusion model ☆67 · Updated 2 years ago
- ☆54 · Updated 2 years ago
- ☆158 · Updated last year
- Discord AI generation bot to collect an aesthetic rating dataset ☆60 · Updated 2 years ago
- An implementation of a server for the Stability AI Stable Diffusion API ☆173 · Updated 2 years ago
- ☆9 · Updated 3 years ago
- An attempt to create an open-source AI companion that is self-hostable ☆80 · Updated 2 years ago
- A ready-to-deploy container implementing an easy-to-use REST API for accessing language models ☆64 · Updated 2 years ago
- Frontend for deep-learning image generation ☆150 · Updated last year
- ☆132 · Updated 2 years ago
- Conversational language model toolkit for training against human preferences ☆41 · Updated last year
- Doohickey is a Stable Diffusion tool for technical artists who want to stay up to date with the latest developments in the field ☆40 · Updated 2 years ago
- Create soft prompts for fairseq 13B dense, GPT-J-6B, and GPT-Neo-2.7B for free in a Google Colab TPU instance ☆28 · Updated 2 years ago
- Simple annotated implementation of GPT-NeoX in PyTorch ☆110 · Updated 2 years ago
- Landmark Attention: Random-Access Infinite Context Length for Transformers QLoRA ☆123 · Updated 2 years ago
- A KoboldAI-like memory extension for oobabooga's text-generation-webui ☆108 · Updated 7 months ago
- Text WebUI extension to add clever Notebooks to Chat mode ☆140 · Updated last week
- Extending Stable Diffusion prompts with suitable style cues using text generation ☆176 · Updated 2 years ago
- Just a simple HowTo for https://github.com/johnsmith0031/alpaca_lora_4bit ☆31 · Updated 2 years ago
- Fork of kingoflolz/mesh-transformer-jax with memory-usage optimizations and support for GPT-Neo, GPT-NeoX, BLOOM, OPT and fairseq dense L… ☆22 · Updated 2 years ago