babylm / baseline-pretraining
Code for pre-training BabyLM baseline models.
☆15 · Updated 2 years ago
Alternatives and similar repositories for baseline-pretraining
Users interested in baseline-pretraining are comparing it to the libraries listed below.
- Mamba training library developed by Kotoba Technologies ☆71 · Updated last year
- Ongoing research on training Mixture of Experts models. ☆19 · Updated 10 months ago
- Supports continual pre-training & instruction tuning; forked from llama-recipes ☆32 · Updated last year
- ☆14 · Updated last year
- ☆61 · Updated last year
- Japanese Massive Multitask Language Understanding Benchmark ☆36 · Updated 7 months ago
- ☆49 · Updated last year
- CycleQD is a framework for parameter-space model merging. ☆40 · Updated 5 months ago
- ☆17 · Updated 7 months ago
- Checkpointable dataset utilities for foundation model training ☆32 · Updated last year
- Ongoing research project for continual pre-training of LLMs (dense models) ☆42 · Updated 4 months ago
- Japanese LLaMa experiment ☆53 · Updated 7 months ago
- Swallow project: evaluation scripts for large language models ☆19 · Updated 3 months ago
- ☆33 · Updated 11 months ago
- Code for "Discovering Preference Optimization Algorithms with and for Large Language Models" ☆63 · Updated last year
- Example of using Epochraft to train HuggingFace transformers models with PyTorch FSDP ☆11 · Updated last year
- ☆53 · Updated 7 months ago
- List of papers on self-correction of LLMs. ☆73 · Updated 6 months ago
- Official implementation of "TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models" ☆110 · Updated 5 months ago
- LEIA: Facilitating Cross-Lingual Knowledge Transfer in Language Models with Entity-based Data Augmentation ☆21 · Updated last year
- Mixtral-based Ja-En (En-Ja) translation model ☆19 · Updated 6 months ago
- ☆16 · Updated 10 months ago
- Flexible evaluation tool for language models ☆49 · Updated last week
- Code for "Unlearning Traces the Influential Training Data of Language Models" ☆12 · Updated last year
- A robust text-processing pipeline framework enabling customizable, efficient, and metric-logged text preprocessing. ☆122 · Updated this week
- ☆74 · Updated last year
- [ICLR 2025] SDTT: a simple and effective distillation method for discrete diffusion models ☆29 · Updated 3 months ago
- Aligned, Review-Informed Edits of Scientific Papers ☆53 · Updated 2 years ago
- ☆22 · Updated last year
- LLM evaluation project for Japanese tasks ☆84 · Updated last week