wangitu / unpadded-AutoGPTQ
Enhanced version of original AutoGPTQ (https://github.com/PanQiWei/AutoGPTQ).
☆10 · Updated last year
Alternatives and similar repositories for unpadded-AutoGPTQ
Users interested in unpadded-AutoGPTQ are comparing it to the libraries listed below.
- ☆172 · Updated 2 years ago
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆136 · Updated 6 months ago
- ☆162 · Updated 2 years ago
- Analysis of the Chinese cognitive abilities of language models ☆236 · Updated last year
- An instruction-tuning tool for large language models (supports FlashAttention) ☆173 · Updated last year
- ☆128 · Updated 2 years ago
- Fast encoding detection and conversion for large numbers of text files, supporting data cleaning for the MNBVC corpus project ☆61 · Updated 7 months ago
- The official code for "Aurora: Activating chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning" ☆263 · Updated last year
- ☆308 · Updated 2 years ago
- ☆169 · Updated last year
- Text deduplication ☆72 · Updated last year
- Scripts for BGE inference optimization ☆28 · Updated last year
- A lightweight local website for displaying the performance of different chat models ☆87 · Updated last year
- Train LLaMA on a single A100 80G node using 🤗 transformers and 🚀 DeepSpeed pipeline parallelism ☆220 · Updated last year
- Chinese instruction-tuning datasets ☆131 · Updated last year
- Deep learning ☆149 · Updated last month
- Fine-tuning for Baichuan-Chat with LoRA, QLoRA, and other methods, runnable with one command ☆70 · Updated last year
- T2Ranking: A large-scale Chinese benchmark for passage ranking ☆158 · Updated last year
- GTS Engine: a powerful, out-of-the-box NLU training system focused on few-shot tasks, able to automatically produce NLP models from only a handful of samples ☆91 · Updated 2 years ago
- Fine-tuning Chinese large language models with QLoRA, covering ChatGLM, Chinese-LLaMA-Alpaca, and BELLE ☆86 · Updated last year
- A simple implementation of using LoRA from the PEFT library to fine-tune ChatGLM-6B ☆83 · Updated 2 years ago
- MD5 links for a Chinese book corpus ☆218 · Updated last year
- Dataset and evaluation script for "Evaluating Hallucinations in Chinese Large Language Models" ☆128 · Updated last year
- ChatGLM2-6B fine-tuning: SFT/LoRA, instruction fine-tuning ☆108 · Updated last year
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆131 · Updated 11 months ago
- ☆84 · Updated last year
- ☆228 · Updated last year
- Zero-shot NLU & NLG based on the mengzi-t5-base-mt pretrained model ☆74 · Updated 2 years ago
- An open-domain multi-turn evaluation benchmark for Chinese general-purpose foundation models ☆79 · Updated last year
- llama inference for tencentpretrain ☆98 · Updated 2 years ago