A quick guide (especially) for trending instruction finetuning datasets
☆3,381 · Nov 28, 2023 · Updated 2 years ago
Alternatives and similar repositories for LLMDataHub
Users interested in LLMDataHub are comparing it to the libraries listed below.
- Curated list of datasets and tools for post-training. ☆4,500 · Apr 27, 2026 · Updated last week
- A framework for few-shot evaluation of language models. ☆12,411 · Updated this week
- Train transformer language models with reinforcement learning. ☆18,282 · Updated this week
- Robust recipes to align language models with human and AI preferences. ☆5,593 · Apr 8, 2026 · Updated 3 weeks ago
- Instruction Tuning with GPT-4. ☆4,337 · Jun 11, 2023 · Updated 2 years ago
- Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024). ☆70,777 · Updated this week
- Tools for merging pretrained large language models. ☆7,052 · Mar 15, 2026 · Updated last month
- An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena. ☆39,463 · Updated this week
- Aligning pretrained language models with instruction data generated by themselves. ☆4,595 · Mar 27, 2023 · Updated 3 years ago
- Summarizes existing representative LLM text datasets. ☆1,461 · Mar 11, 2026 · Updated last month
- Awesome-LLM: a curated list of Large Language Models. ☆26,735 · Jul 31, 2025 · Updated 9 months ago
- A collection of open-source datasets to train instruction-following LLMs (ChatGPT, LLaMA, Alpaca). ☆1,147 · Jan 4, 2024 · Updated 2 years ago
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ☆21,052 · Updated this week
- LLMs built upon Evol Instruct: WizardLM, WizardCoder, WizardMath. ☆9,481 · Jun 7, 2025 · Updated 10 months ago
- Go ahead and axolotl questions