microsoft / GODEL
Large-scale pretrained models for goal-directed dialog
☆888 · Updated 2 years ago
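For reference, GODEL checkpoints are published on the Hugging Face Hub and can be driven through the standard `transformers` seq2seq API. The sketch below is a minimal example assuming the `microsoft/GODEL-v1_1-base-seq2seq` checkpoint and the instruction + `[CONTEXT]` + dialog prompt format described in the GODEL repository; treat the exact model ID and prompt wording as assumptions to verify against the repo.

```python
# Minimal sketch of response generation with GODEL (assumes the
# microsoft/GODEL-v1_1-base-seq2seq checkpoint on the Hugging Face Hub).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "microsoft/GODEL-v1_1-base-seq2seq"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# GODEL takes an instruction, the dialog history, and optional grounding
# knowledge concatenated into a single input string.
instruction = "Instruction: given a dialog context, you need to respond empathically."
dialog = [
    "Does money buy happiness?",
    "It is a question. Money buys you a lot of things, but not happiness.",
]
knowledge = ""  # prepend "[KNOWLEDGE] ..." here to ground the response in external text
query = f"{instruction} [CONTEXT] {' EOS '.join(dialog)} {knowledge}".strip()

input_ids = tokenizer(query, return_tensors="pt").input_ids
outputs = model.generate(input_ids, max_length=128, min_length=8, top_p=0.9, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The larger seq2seq checkpoint follows the same pattern; only the model ID changes.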
Alternatives and similar repositories for GODEL
Users who are interested in GODEL are comparing it to the libraries listed below.
- Open-source pre-training implementation of Google's LaMDA in PyTorch. Adding RLHF similar to ChatGPT. ☆470 · Updated last year
- Crosslingual Generalization through Multitask Finetuning ☆537 · Updated last year
- Large-scale pretraining for dialogue ☆2,419 · Updated 3 years ago
- LaMini-LM: A Diverse Herd of Distilled Models from Large-Scale Instructions ☆823 · Updated 2 years ago
- Repo for fine-tuning Causal LLMs ☆457 · Updated last year
- An open-source implementation of Google's PaLM models ☆820 · Updated last year
- Alpaca dataset from Stanford, cleaned and curated ☆1,581 · Updated 2 years ago
- A collection of modular datasets generated by GPT-4: General-Instruct, Roleplay-Instruct, Code-Instruct, and Toolformer ☆1,629 · Updated 2 years ago
- Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways ☆828 · Updated 3 years ago
- This repository contains code for extending the Stanford Alpaca synthetic instruction tuning to existing instruction-tuned models such as… ☆358 · Updated 2 years ago
- Fast Inference Solutions for BLOOM ☆566 · Updated last year
- A dataset containing human-human knowledge-grounded open-domain conversations. ☆670 · Updated last year
- Expanding natural instructions ☆1,029 · Updated 2 years ago
- Tune any FALCON in 4-bit ☆463 · Updated 2 years ago
- ☆1,560 · Updated this week
- ☆1,634 · Updated 2 years ago
- SGPT: GPT Sentence Embeddings for Semantic Search ☆873 · Updated last year
- Salesforce open-source LLMs with 8k sequence length. ☆723 · Updated last year
- A method to fix GPT-3 after deployment with user feedback, without re-training. ☆331 · Updated 2 years ago
- Ongoing research training transformer models at scale ☆395 · Updated last year
- Ask Me Anything language model prompting ☆548 · Updated 2 years ago
- Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models. ☆540 · Updated 3 weeks ago
- Guide: Finetune GPT2-XL (1.5 Billion Parameters) and finetune GPT-Neo (2.7B) on a single GPU with Huggingface Transformers using DeepSpe… ☆434 · Updated 2 years ago
- A tiny library for coding with large language models. ☆1,233 · Updated last year
- Reproduce results and replicate training of T0 (Multitask Prompted Training Enables Zero-Shot Task Generalization) ☆465 · Updated 3 years ago
- ChatLLaMA 📢 Open-source implementation of LLaMA-based ChatGPT, runnable on a single GPU. 15x faster training process than ChatGPT ☆1,203 · Updated last year
- A search engine for ParlAI's BlenderBot project (and probably other ones as well) ☆130 · Updated 4 years ago
- Dromedary: towards helpful, ethical and reliable LLMs. ☆1,143 · Updated 4 months ago
- ☆457 · Updated 2 years ago
- simpleT5, built on top of PyTorch Lightning⚡️ and Transformers🤗, lets you quickly train your T5 models. ☆400 · Updated 2 years ago