yiyepiaoling0715 / codellm-data-preprocess-pipeline
Data processing for code LLM pretraining, fine-tuning, and DPO; a state-of-the-art industry processing pipeline
☆46 · Updated last year
Alternatives and similar repositories for codellm-data-preprocess-pipeline
Users interested in codellm-data-preprocess-pipeline are comparing it to the libraries listed below
- WritingBench: A Comprehensive Benchmark for Generative Writing ☆143 · Updated last week
- ☆50 · Updated last year
- [ACL 2024 Demo] Official GitHub repo for UltraEval: An open source framework for evaluating foundation models. ☆253 · Updated last year
- ☆147 · Updated last year
- How to train an LLM tokenizer ☆154 · Updated 2 years ago
- CLongEval: A Chinese Benchmark for Evaluating Long-Context Large Language Models ☆45 · Updated last year
- ☆40 · Updated last year
- ☆98 · Updated last year
- Heuristic filtering framework for RefineCode ☆82 · Updated 9 months ago
- Llama-3-SynE: A Significantly Enhanced Version of Llama-3 with Advanced Scientific Reasoning and Chinese Language Capabilities | Continual pretraining improves … ☆36 · Updated 6 months ago
- Imitate OpenAI with Local Models ☆89 · Updated last year
- ☆181 · Updated 2 years ago
- [EMNLP 2024] LongAlign: A Recipe for Long Context Alignment of LLMs ☆257 · Updated 11 months ago
- SuperCLUE-Agent: A benchmark for evaluating the core capabilities of agents on native Chinese tasks ☆94 · Updated 2 years ago
- ☆125 · Updated last year
- ☆146 · Updated last year
- [ACL 2024] MT-Bench-101: A Fine-Grained Benchmark for Evaluating Large Language Models in Multi-Turn Dialogues ☆130 · Updated last year
- LLaMA Factory Document ☆159 · Updated last week
- NaturalCodeBench (Findings of ACL 2024) ☆68 · Updated last year
- Clustering and Ranking: Diversity-preserved Instruction Selection through Expert-aligned Quality Estimation ☆90 · Updated last year
- ☆98 · Updated 2 years ago
- ☆83 · Updated last year
- a-m-team's exploration in large language modeling ☆194 · Updated 6 months ago
- ☆51 · Updated last year
- Official Repository for SIGIR2024 Demo Paper "An Integrated Data Processing Framework for Pretraining Foundation Models" ☆84 · Updated last year
- ☆77 · Updated 10 months ago
- ☆122 · Updated last year
- Dataset and evaluation script for "Evaluating Hallucinations in Chinese Large Language Models" ☆136 · Updated last year
- Scripts of LLM pre-training and fine-tuning (w/wo LoRA, DeepSpeed) ☆86 · Updated last year
- Pretrain, decay, and SFT a CodeLLM from scratch 🧙‍♂️ ☆39 · Updated last year