yann168 / boshi-sample-solution
☆15 · Updated 2 years ago
Alternatives and similar repositories for boshi-sample-solution
Users interested in boshi-sample-solution are comparing it to the libraries listed below.
- General-purpose layout analysis | Chinese document parsing | Document Layout Analysis | layout parser ☆47 · Updated last year
- ☆106 · Updated 2 years ago
- A single codebase for instruction fine-tuning of large language models ☆38 · Updated 2 years ago
- Demo for the AIOPS24 challenge ☆64 · Updated last year
- Code implementation of Dynamic NTK-ALiBi for Baichuan: inference over longer texts without fine-tuning ☆49 · Updated 2 years ago
- KDD 2024 AQA competition 2nd place solution ☆12 · Updated last year
- A benchmark for evaluating native Chinese retrieval-augmented generation ☆123 · Updated last year
- ☆52 · Updated 10 months ago
- Recursive Abstractive Processing for Tree-Organized Retrieval ☆10 · Updated last year
- The newest version of Llama 3, with the source code explained line by line in Chinese ☆22 · Updated last year
- LLM + RAG for QA ☆23 · Updated last year
- Simple and efficient multi-GPU fine-tuning of large language models with DeepSpeed + Trainer ☆129 · Updated 2 years ago
- Winning solution for the 2020 Xiamen International Bank "Digital Innovation Finance Cup" modeling competition ☆11 · Updated 4 years ago
- 1st Solution For Conversational Multi-Doc QA Workshop & International Challenge @ WSDM'24 - Xiaohongshu.Inc ☆162 · Updated 4 months ago
- MNBVC text quality classification using fastText ☆16 · Updated 2 years ago
- Chinese financial LLM evaluation benchmark: 25 tasks across six categories with graded ratings; domestic models achieved an A grade ☆10 · Updated last year
- ☆19 · Updated last year
- ChatGLM2-6B-Explained ☆36 · Updated 2 years ago
- CAIL 2023 ☆41 · Updated 2 years ago
- A survey of large language model training and serving ☆36 · Updated 2 years ago
- A demo PDF parser (including OCR and object-detection tools) ☆36 · Updated last year
- TianGong-AI-Unstructure ☆69 · Updated last month
- Baidu QA dataset with 1 million entries ☆47 · Updated 2 years ago
- Use ChatGLM to perform text embedding ☆45 · Updated 2 years ago
- ChatGLM2-6B fine-tuning: SFT/LoRA, instruction fine-tuning ☆110 · Updated 2 years ago
- GTS Engine: a powerful, out-of-the-box natural language understanding (NLU) training engine focused on few-shot tasks, able to automatically produce NLP models from only a small number of samples ☆93 · Updated 2 years ago
- Instruction-tuning toolkit for large language models (supports FlashAttention) ☆178 · Updated last year
- Three-stage large language model training plus deployment ☆49 · Updated 2 years ago
- ☆15 · Updated last year
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 ☆19 · Updated 2 years ago