yuzhaouoe / pretraining-data-packing
[ACL'24 Oral] Analysing The Impact of Sequence Composition on Language Model Pre-Training
☆23 · Updated last year
Alternatives and similar repositories for pretraining-data-packing
Users interested in pretraining-data-packing are comparing it to the libraries listed below.
- (untitled repository) — ☆55 · Updated last year
- [ICLR'25] Data and code for the paper "Why Does the Effective Context Length of LLMs Fall Short?" — ☆78 · Updated last year
- [NeurIPS 2023] Repetition In Repetition Out: Towards Understanding Neural Text Degeneration from the Data Perspective — ☆39 · Updated 2 years ago
- Repository for "Propagating Knowledge Updates to LMs Through Distillation" (NeurIPS 2023) — ☆26 · Updated last year
- Repo for the paper "Large Language Models Struggle to Learn Long-Tail Knowledge" — ☆78 · Updated 2 years ago
- (untitled repository) — ☆89 · Updated 3 years ago
- GSM-Plus: Data, Code, and Evaluation for Enhancing Robust Mathematical Reasoning in Math Word Problems — ☆64 · Updated last year
- "FiD-ICL: A Fusion-in-Decoder Approach for Efficient In-Context Learning" (ACL 2023) — ☆15 · Updated 2 years ago
- (untitled repository) — ☆23 · Updated 2 years ago
- (untitled repository) — ☆27 · Updated last year
- Official repository for the ACL 2025 paper "Model Extrapolation Expedites Alignment" — ☆75 · Updated 8 months ago
- Code and data for the paper "Context-faithful Prompting for Large Language Models" — ☆42 · Updated 2 years ago