mallorbc / GPT_Neo_quotes_dataset
☆27 · Updated 4 years ago
Alternatives and similar repositories for GPT_Neo_quotes_dataset
Users interested in GPT_Neo_quotes_dataset are comparing it to the repositories listed below.
- ☆34 · Updated 4 years ago
- 🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0. ☆55 · Updated 3 years ago
- A basic UI for running GPT-Neo 2.7B on low VRAM (3 GB VRAM minimum) ☆36 · Updated 4 years ago
- ☆63 · Updated 4 years ago
- ☆50 · Updated 2 years ago
- Create soft prompts for fairseq 13B dense, GPT-J-6B, and GPT-Neo-2.7B for free in a Google Colab TPU instance ☆28 · Updated 2 years ago
- ☆131 · Updated 3 years ago
- Notebook for running GPT-Neo models based on GPT-3 ☆62 · Updated 4 years ago
- Colab notebooks to run a basic AI Dungeon clone using gpt-neo-2.7B ☆60 · Updated 4 years ago
- Training and implementation of chatbots leveraging a GPT-like architecture with the aitextgen package to enable dynamic conversations. ☆48 · Updated 3 years ago
- ☆21 · Updated 4 years ago
- ☆123 · Updated 2 years ago
- Code for OpenAI Whisper Web App Demo ☆93 · Updated 3 years ago
- Fine-tuning GPT-J-6B on Colab or an equivalent PC GPU with your custom datasets: 8-bit weights with low-rank adaptors (LoRA) ☆74 · Updated 3 years ago
- ☆64 · Updated 2 years ago
- A search engine for ParlAI's BlenderBot project (and probably other ones as well) ☆130 · Updated 3 years ago
- ☆27 · Updated 2 years ago
- Fine-tuning 6-billion-parameter GPT-J (and other models) with LoRA and 8-bit compression ☆68 · Updated 3 years ago
- Patch for MPT-7B which allows using and training a LoRA ☆58 · Updated 2 years ago
- Repo for fine-tuning causal LLMs ☆455 · Updated last year
- Simple annotated implementation of GPT-NeoX in PyTorch ☆110 · Updated 3 years ago
- ☆22 · Updated 2 years ago
- Just a repo with some AI Dungeon scripts ☆30 · Updated 4 years ago
- Guide: fine-tune GPT2-XL (1.5 billion parameters) and GPT-Neo (2.7B) on a single GPU with Hugging Face Transformers using DeepSpe… ☆436 · Updated 2 years ago
- Experiments with generating open-source language model assistants ☆97 · Updated 2 years ago
- A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2,000-token context, 3.5 GB for a 1,000-token context). Model load… ☆114 · Updated 3 years ago
- Smol but mighty language model ☆61 · Updated 2 years ago
- Conversational language model toolkit for training against human preferences. ☆41 · Updated last year
- Drop-in replacement for OpenAI, but with open models. ☆153 · Updated 2 years ago
- A ready-to-deploy container implementing an easy-to-use REST API to access language models. ☆66 · Updated 2 years ago