CyberCommy / baidu-qa-100w
Baidu QA dataset with 1 million entries
☆47 · Updated last year
Alternatives and similar repositories for baidu-qa-100w
Users interested in baidu-qa-100w are comparing it to the repositories listed below.
- The complete training code of the open-source high-performance Llama model, including the full process from pre-training to RLHF. ☆68 · Updated 2 years ago
- ChatGLM-6B-Slim: ChatGLM-6B with 20K image tokens pruned away; identical performance with a smaller GPU memory footprint. ☆127 · Updated 2 years ago
- A Python package to access world-class generative models ☆129 · Updated last year
- Gaokao Benchmark for AI ☆109 · Updated 3 years ago
- SuperCLUE Langya Leaderboard: an anonymous battle evaluation benchmark for Chinese general-purpose large models ☆145 · Updated last year
- Evaluation for AI apps and agents ☆43 · Updated last year
- The first Chinese Llama2 13B model (base model plus Chinese dialogue SFT, enabling fluent multi-turn natural-language interaction) ☆91 · Updated 2 years ago
- A survey of large language model training and serving ☆36 · Updated 2 years ago
- Line-by-line explanation of the Qwen 14B and 7B models ☆63 · Updated 2 years ago
- Service for BERT model to vector: an efficient text-to-vector service supporting multiple GPUs, multiple workers, and multiple concurrent clients; works out of the box. ☆12 · Updated 3 years ago
- XVERSE-7B: A multilingual large language model developed by XVERSE Technology Inc. ☆53 · Updated last year
- Tracking the hottest GitHub repositories, updated daily ☆49 · Updated this week
- Finetune Llama 3, Mistral & Gemma LLMs 2-5x faster with 80% less memory ☆29 · Updated last year
- Another ChatGLM2 implementation for GPTQ quantization ☆53 · Updated 2 years ago
- Large-scale exact string matching tool ☆17 · Updated 8 months ago
- ☆23 · Updated 2 years ago
- A light proxy solution for the HuggingFace hub ☆46 · Updated 2 years ago
- A pure C++ cross-platform LLM acceleration library with Python bindings; supports baichuan, glm, llama, and moss base models; runs ChatGLM-6B-class models smoothly on mobile, and reaches 10000+ tokens/s on a single GPU ☆45 · Updated 2 years ago
- SearchGPT: Building a quick conversation-based search engine with LLMs ☆46 · Updated 10 months ago
- A prompt engineer's toolkit for comparing multiple prompts across multiple LLM models at once ☆96 · Updated 2 years ago
- ☆106 · Updated 2 years ago
- GTS Engine: A powerful NLU training system that focuses on few-shot tasks and can automatically produce NLP models from only a handful of samples; works out of the box. ☆93 · Updated 2 years ago
- Large language model training in three stages, plus deployment ☆49 · Updated 2 years ago
- CodeGPT: A code-related dialogue dataset generated by GPT and for GPT ☆114 · Updated 2 years ago
- XVERSE-MoE-A4.2B: A multilingual large language model developed by XVERSE Technology Inc. ☆39 · Updated last year
- Baidu Baike dataset with 5 million entries ☆42 · Updated last year
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆141 · Updated last year
- Shared data: prompt data and pretraining data ☆36 · Updated last year
- ☆235 · Updated last year
- Chinese instruction dataset for fine-tuning LLMs ☆28 · Updated 2 years ago