mallorbc / GPT_Neo_fine-tuning_notebook
☆ 35 · Updated 3 years ago
Alternatives and similar repositories for GPT_Neo_fine-tuning_notebook
Users interested in GPT_Neo_fine-tuning_notebook are comparing it to the libraries listed below; two minimal fine-tuning sketches follow the list.
- Guide: Finetune GPT2-XL (1.5 Billion Parameters) and finetune GPT-NEO (2.7 B) on a single GPU with Huggingface Transformers using DeepSpe… ☆ 437 · Updated last year
- ☆ 27 · Updated 3 years ago
- 🤗Transformers: State-of-the-art Natural Language Processing for Pytorch and TensorFlow 2.0. ☆ 56 · Updated 3 years ago
- A simple approach to use GPT2-medium (345M) for generating high quality text summaries with minimal training. ☆ 156 · Updated 2 years ago
- Simple Annotated implementation of GPT-NeoX in PyTorch ☆ 110 · Updated 2 years ago
- Fine-tuning GPT-J-6B on colab or equivalent PC GPU with your custom datasets: 8-bit weights with low-rank adaptors (LoRA) ☆ 74 · Updated 2 years ago
- Notebook for running GPT neo models based on GPT3 ☆ 63 · Updated 3 years ago
- ☆ 50 · Updated 2 years ago
- Repo for fine-tuning Causal LLMs ☆ 456 · Updated last year
- Hybrid Conversational Bot based on both neural retrieval and neural generative mechanism with TTS. ☆ 83 · Updated last year
- Just a repo with some AI Dungeon scripts ☆ 29 · Updated 3 years ago
- ☆ 130 · Updated 2 years ago
- ☆ 28 · Updated 2 years ago
- A repository to run gpt-j-6b on low vram machines (4.2 gb minimum vram for 2000 token context, 3.5 gb for 1000 token context). Model load… ☆ 115 · Updated 3 years ago
- ☆ 63 · Updated 3 years ago
- A basic ui for running gpt neo 2.7B on low vram (3 gb Vram minimum) ☆ 36 · Updated 3 years ago
- Fine-tuning 6-Billion GPT-J (& other models) with LoRA and 8-bit compression ☆ 66 · Updated 2 years ago
- Training & Implementation of chatbots leveraging GPT-like architecture with the aitextgen package to enable dynamic conversations. ☆ 49 · Updated 2 years ago
- ☆ 9 · Updated 3 years ago
- Colab notebooks to run a basic AI Dungeon clone using gpt-neo-2.7B ☆ 63 · Updated 3 years ago
- Create soft prompts for fairseq 13B dense, GPT-J-6B and GPT-Neo-2.7B for free in a Google Colab TPU instance ☆ 28 · Updated 2 years ago
- ai-content-gen-with-bloom ☆ 41 · Updated 2 years ago
- 💭 Fine-tune a Covid-19 Doctor-like chatbot with GPT2 ☆ 51 · Updated 4 years ago
- ☆ 23 · Updated 2 years ago
- Reproducing "Writing with Transformer" demo, using aitextgen/FastAPI in backend, Quill/React in frontend ☆ 28 · Updated 4 years ago
- A Streamlit app running GPT-2 language model for text classification, built with Pytorch, Transformers and AWS SageMaker. ☆ 39 · Updated 3 years ago
- Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models. ☆ 536 · Updated last month
- This project is used to generate a blog post using Natural Language processing, Hugging Face Transformers and GPT-2 Model. ☆ 17 · Updated 4 years ago
- A GPT-J API to use with python3 to generate text, blogs, code, and more ☆ 204 · Updated 2 years ago
- Conversational AI tooling & personas built on Cohere's LLMs ☆ 174 · Updated last year
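As a point of reference for the fine-tuning repositories above (the GPT2-XL/GPT-Neo DeepSpeed guide in particular), the sketch below shows a minimal causal-LM fine-tune with the Hugging Face `Trainer`. It is illustrative only: the checkpoint, the `train.txt` file, and all hyperparameters are placeholder assumptions, and each listed repository ships its own scripts.

```python
# Minimal causal-LM fine-tuning sketch with Hugging Face Transformers.
# Assumes `train.txt` is a plain-text file of training examples and a single CUDA GPU.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "EleutherAI/gpt-neo-125M"  # swap for gpt-neo-1.3B / 2.7B if memory allows
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-Neo has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Tokenize the raw text; the collator builds the causal-LM labels.
dataset = load_dataset("text", data_files={"train": "train.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt-neo-finetuned",
    per_device_train_batch_size=2,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
    fp16=True,  # assumes a CUDA GPU
    logging_steps=50,
    save_strategy="epoch",
)

Trainer(model=model, args=args, train_dataset=tokenized, data_collator=collator).train()
```

Scaling this to the 1.5B/2.7B checkpoints on a single GPU is what the DeepSpeed-based guide above addresses, typically by supplying a DeepSpeed config (for example via `TrainingArguments(deepspeed=...)` or the `deepspeed` launcher); that part is omitted here.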
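Several entries above pair GPT-J-6B with LoRA and 8-bit weights. Those repos ship their own quantization and adapter code; a rough present-day equivalent using `peft` and `bitsandbytes` (the library choice and every hyperparameter here are my assumptions, not taken from the listed repos) looks roughly like this:

```python
# Sketch of LoRA on top of 8-bit weights, in the spirit of the GPT-J-6B LoRA/8-bit
# repos above. Requires transformers, peft, bitsandbytes, and accelerate.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

model_name = "EleutherAI/gpt-j-6B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # 8-bit base weights
    device_map="auto",
)

# Inject low-rank adapters; only the adapter weights are trained.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections in GPT-J
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # a small fraction of the 6B base weights
```

The resulting model can then be trained with the same `Trainer` setup as in the previous sketch; only the adapter parameters receive gradients, which is what keeps the memory footprint in the low-VRAM range the listed repos target.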