ZBayes / poc_project
A general-purpose project of simple utilities
☆22 · Updated last year
Alternatives and similar repositories for poc_project
Users interested in poc_project are comparing it to the libraries listed below:
- LLM for NER ☆80 · Updated last year
- An introduction to using Docker and Docker Compose. ☆21 · Updated last year
- Parameter-efficient fine-tuning of ChatGLM-6B with LoRA and P-Tuning v2 ☆55 · Updated 2 years ago
- A Python implementation of the BM25 text-matching algorithm ☆33 · Updated 3 years ago
- 2023 Global Smart Automotive AI Challenge, Track 1: LLM retrieval-based QA, 75+ baseline ☆60 · Updated 2 years ago
- Instruction fine-tuning of the BLOOM model ☆24 · Updated 2 years ago
- GoGPT: Chinese-English enhanced LLMs trained on Llama/Llama 2 | Chinese-Llama2 ☆79 · Updated 2 years ago
- ChatGLM2-6B fine-tuning: SFT/LoRA instruction fine-tuning ☆110 · Updated 2 years ago
- ☆41 · Updated 10 months ago
- A Challenge on Dialog Systems with Retrieval-Augmented Generation (FutureDial-RAG), co-located with SLT 2024 ☆11 · Updated last year
- Baseline for the Chinese Text Correction competition ☆66 · Updated 3 years ago
- Demo for the AIOPS24 challenge ☆64 · Updated last year
- GoGPT Chinese instruction dataset construction ☆10 · Updated 2 years ago
- Alibaba Tianchi: 2023 Global Smart Automotive AI Challenge, Track 1: LLM retrieval-based QA, 80+ baseline ☆120 · Updated 2 years ago
- ☆23 · Updated 2 years ago
- A practical guide to large language models: application practice and real-world deployment ☆87 · Updated last year
- Focused on Chinese domain-specific large language models: grounding a model in a particular industry or field to build an industry-level or company-level domain model. ☆126 · Updated 11 months ago
- LLM+RAG for QA ☆22 · Updated 2 years ago
- A repo for updating and debugging Mixtral-8x7B, MoE, ChatGLM3, LLaMA 2, Baichuan, Qwen, and other LLMs, including new models such as Mixtral 8x7B, … ☆47 · Updated 4 months ago
- ChatGLM2-6B-Explained ☆36 · Updated 2 years ago
- ☆57 · Updated 3 years ago
- meta-comprehensive-rag-benchmark-kdd-cup-2024, phase 1 task 1, rank 3 ☆21 · Updated last year
- A basic framework for RAG (retrieval-augmented generation) ☆86 · Updated 2 years ago
- Hands-on information extraction with LLaMA ☆102 · Updated 2 years ago
- First-place solution for the Tianchi algorithm competition "BetterMixture: LLM Data Mixing Challenge" ☆34 · Updated last year
- PyTorch implementation of Baidu's UIE for named entity recognition. ☆57 · Updated 3 years ago
- ☆19 · Updated last year
- Integrates advanced LLMs such as Qwen and DeepSeek, supporting both a pure LLM + classification-head mode and an LLM + LoRA + classification-head mode; a modular transformers-based design for training makes it easy to adjust or swap components as needed. ☆19 · Updated 5 months ago
- An instruction-tuning toolkit for large language models (with FlashAttention support) ☆177 · Updated 2 years ago
- An LLM for NL2GQL with NebulaGraph or Neo4j ☆97 · Updated 2 years ago