Azure99 / BlossomData
A fluent, scalable, and easy-to-use LLM data processing framework.
☆24 · Updated 2 weeks ago
Alternatives and similar repositories for BlossomData
Users interested in BlossomData are comparing it to the repositories listed below.
- Generate multi-round conversation roleplay data based on self-instruct and evol-instruct. ☆134 · Updated 7 months ago
- Imitate OpenAI with Local Models ☆88 · Updated 11 months ago
- Mixture-of-Experts (MoE) Language Model ☆189 · Updated 11 months ago
- SUS-Chat: Instruction tuning done right ☆49 · Updated last year
- Code for Scaling Laws of RoPE-based Extrapolation ☆73 · Updated last year
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆139 · Updated last year
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆131 · Updated last year
- ☆231 · Updated last year
- ☆94 · Updated 8 months ago
- Qwen-WisdomVast is a large model trained on 1 million high-quality Chinese multi-turn SFT data, 200,000 English multi-turn SFT data, and … ☆18 · Updated last year
- ☆49 · Updated last year
- GLM Series Edge Models ☆147 · Updated last month
- Uses LangChain for task planning and builds conversational scene resources for each subtask; an MCTS-based task executor lets each subtask draw on in-context resources and self-reflective exploration to reach its optimal answer. This approach relies on the model's alignment preferences, and for each preference an engineering framework is designed to sample rewards over the different answers. ☆29 · Updated 3 weeks ago
- The official code for "Aurora: Activating Chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning" ☆263 · Updated last year
- Delta-CoMe achieves near-lossless 1-bit compression; accepted at NeurIPS 2024. ☆56 · Updated 8 months ago
- The simplest reproduction of R1-style results on small models, illustrating the key essence of O1-like models and DeepSeek R1: think is all you need. Experiments support that, for strong reasoning capability, the content of the "think" reasoning process is the core of AGI/ASI. ☆44 · Updated 6 months ago
- Implementation of the LongRoPE paper: Extending LLM Context Window Beyond 2 Million Tokens ☆148 · Updated last year
- The first fully commercially usable roleplay large language model. ☆40 · Updated 11 months ago
- FuseAI Project ☆87 · Updated 6 months ago
- ☆30 · Updated 11 months ago
- LongQLoRA: Extend the Context Length of LLMs Efficiently ☆166 · Updated last year
- Skywork-MoE: A Deep Dive into Training Techniques for Mixture-of-Experts Language Models ☆136 · Updated last year
- Performs fast encoding detection and conversion on large numbers of text files to assist data cleaning for the MNBVC corpus project. ☆61 · Updated 9 months ago
- Another ChatGLM2 implementation for GPTQ quantization ☆54 · Updated last year
- Lightweight local website for displaying the performance of different chat models. ☆87 · Updated last year
- ☆40 · Updated last year
- Evaluation for AI apps and agents ☆42 · Updated last year
- ☆106 · Updated last year
- ☆27 · Updated 9 months ago
- ☆144 · Updated last year