boom-R123 / ChatWK
LLM chat with knowledge
☆20 Updated last year
Alternatives and similar repositories for ChatWK:
Users interested in ChatWK are comparing it to the libraries listed below
- ☆23 Updated last year
- MOSS chat fine-tuning ☆50 Updated last year
- Measuring Massive Multitask Chinese Understanding ☆87 Updated last year
- Use ChatGLM to perform text embedding ☆45 Updated 2 years ago
- Simple and efficient multi-GPU fine-tuning of large models with DeepSpeed and Trainer ☆125 Updated last year
- A native Chinese benchmark for evaluating retrieval-augmented generation ☆115 Updated last year
- ChatGLM2-6B fine-tuning: SFT/LoRA, instruction fine-tuning ☆107 Updated last year
- An LLM for NL2GQL with NebulaGraph or Neo4j ☆92 Updated last year
- Supports LoRA fine-tuning of ChatGLM2 ☆40 Updated last year
- Luotuo QA (骆驼QA), a Chinese large language model for reading comprehension ☆74 Updated last year
- ☆160 Updated 2 years ago
- An LLM with Lu Xun (鲁迅) style ☆84 Updated last year
- Large language model training (3 stages) plus deployment ☆47 Updated last year
- HanFei-1.0 (韩非), China's first legal large language model trained with full-parameter training ☆116 Updated last year
- Implements Baichuan-Chat fine-tuning with various methods such as LoRA and QLoRA, runnable with one click ☆70 Updated last year
- Makes LLMs easier to use ☆59 Updated last year
- ☆43 Updated last year
- ☆172 Updated 2 years ago
- Hands-on information extraction with LLaMA ☆100 Updated 2 years ago
- A Chinese instruction dataset for fine-tuning LLMs ☆26 Updated 2 years ago
- Chinese large language model evaluation, round two ☆70 Updated last year
- GoGPT: a Chinese-English enhanced large model trained on Llama/Llama 2 | Chinese-Llama2 ☆78 Updated last year
- ☆40 Updated 2 months ago
- How to train an LLM tokenizer ☆144 Updated last year
- Aims to provide an intuitive, concrete, and standardized evaluation of current mainstream LLMs ☆94 Updated last year
- Parameter-efficient fine-tuning of ChatGLM-6B based on LoRA and P-Tuning v2 ☆55 Updated last year
- Instruction fine-tuning of large models with a single codebase ☆39 Updated last year
- An open-source conversational language model developed by the Knowledge Works Research Laboratory at Fudan University. ☆64 Updated last year
- SuperCLUE-Agent: a benchmark for evaluating core agent capabilities on native Chinese tasks ☆84 Updated last year
- ☆11 Updated last year