mallorbc / GPT_Neo_fine-tuning_notebook
☆34 · Updated 3 years ago
Alternatives and similar repositories for GPT_Neo_fine-tuning_notebook:
Users interested in GPT_Neo_fine-tuning_notebook are comparing it to the libraries listed below.
- ☆27 · Updated 3 years ago
- Guide: Finetune GPT2-XL (1.5 Billion Parameters) and finetune GPT-NEO (2.7 B) on a single GPU with Huggingface Transformers using DeepSpe… ☆437 · Updated last year
- Fine-tuning GPT-J-6B on Colab or an equivalent PC GPU with your custom datasets: 8-bit weights with low-rank adaptors (LoRA) ☆74 · Updated 2 years ago
- Notebook for running GPT-Neo models based on GPT-3 ☆63 · Updated 3 years ago
- Training and implementation of chatbots leveraging a GPT-like architecture with the aitextgen package to enable dynamic conversations. ☆49 · Updated 2 years ago
- Hybrid conversational bot based on both neural retrieval and neural generative mechanisms, with TTS. ☆82 · Updated last year
- Fine-tuning 6-billion-parameter GPT-J (and other models) with LoRA and 8-bit compression ☆66 · Updated 2 years ago
- Repo for fine-tuning causal LLMs ☆454 · Updated last year
- Simple annotated implementation of GPT-NeoX in PyTorch ☆110 · Updated 2 years ago
- ☆130 · Updated 2 years ago
- 🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0. ☆56 · Updated 3 years ago
- ☆62 · Updated 3 years ago
- Fine-tuning GPT-2 Small for Question Answering ☆130 · Updated 2 years ago
- 💭 Fine-tune a COVID-19 doctor-like chatbot with GPT-2 ☆51 · Updated 4 years ago
- ☆23 · Updated 2 years ago
- ai-content-gen-with-bloom ☆41 · Updated 2 years ago
- A basic UI for running GPT-Neo 2.7B on low VRAM (3 GB minimum) ☆36 · Updated 3 years ago
- A series of notebooks demonstrating how to build simple NLP web apps with Gradio and Hugging Face Transformers ☆45 · Updated 3 years ago
- ☆49 · Updated 2 years ago
- ☆32 · Updated 2 years ago
- GPT2Explorer brings the OpenAI GPT-2 language-model playground to standard Windows computers, running locally. ☆29 · Updated 2 years ago
- ☆9 · Updated 3 years ago
- simpleT5, built on top of PyTorch Lightning⚡️ and Transformers🤗, lets you quickly train your T5 models. ☆394 · Updated last year
- GPT-2 French demo ☆68 · Updated 4 years ago
- ☆28 · Updated last year
- Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models. ☆532 · Updated last month
- ☆58 · Updated 3 years ago
- Generate NFTs or train a new model in just a few clicks! Train as much as you can; others will resume from your checkpoint! ☆152 · Updated 2 years ago
- ☆33 · Updated last year
- An example of multilingual machine translation using a pretrained version of mT5 from Hugging Face. ☆42 · Updated 4 years ago