padeoe / hf-mirror-site
A Hugging Face mirror site.
☆326Updated last year
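For context, clients usually consume a mirror like this through the `HF_ENDPOINT` override recognized by `huggingface_hub`. Below is a minimal sketch, assuming the public hf-mirror.com instance; the URL is an assumption, so substitute whatever endpoint your deployment exposes.

```python
# Minimal sketch: route huggingface_hub downloads through a mirror endpoint.
# HF_ENDPOINT must be set before importing huggingface_hub, because the
# library reads it at import time. The URL below is an assumption -- use
# whatever endpoint your mirror deployment exposes.
import os
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

from huggingface_hub import snapshot_download

# Download a model repository via the mirror instead of huggingface.co
snapshot_download(repo_id="gpt2", local_dir="./gpt2")
```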
Alternatives and similar repositories for hf-mirror-site
Users interested in hf-mirror-site are comparing it to the repositories listed below
- Chinese Mixtral mixture-of-experts large language models (Chinese Mixtral MoE LLMs)☆608Updated last year
- huggingface mirror download☆588Updated 9 months ago
- Yuan 2.0 Large Language Model☆690Updated last year
- hf-mirror-cli uses a domestic (China) mirror, works out of the box with no configuration, and quickly downloads models from Hugging Face☆150Updated 11 months ago
- The official repo of Aquila2 series proposed by BAAI, including pretrained & chat large language models.☆446Updated last year
- Phi3 Chinese post-training model repository☆324Updated last year
- ☆349Updated last year
- This is the first Chinese chat model specifically fine-tuned for Chinese through ORPO based on the Meta-Llama-3-8B-Instruct model.☆321Updated last year
- Llama3-Chinese is a large model built on Meta-Llama-3-8B and trained with DoRA + LoRA+ on 500k high-quality Chinese multi-turn SFT samples, 100k English multi-turn SFT samples, and 2,000 single-turn self-cognition samples.☆295Updated last year
- ☆457Updated 2 years ago
- Chinese Mixtral-8x7B (Chinese-Mixtral-8x7B)☆656Updated last year
- ☆341Updated 3 months ago
- [Every entry processed] A curated QA dataset of selected Ruozhiba (弱智吧) questions, with each item manually reviewed and revised☆241Updated 9 months ago
- TeleChat2, the Xingchen (星辰) semantic large model, is a large language model developed and trained by the China Telecom Artificial Intelligence Research Institute; it is the first open-source 100-billion-parameter model trained entirely on domestic compute☆266Updated 5 months ago
- CMMLU: Measuring massive multitask language understanding in Chinese☆801Updated last year
- C++ implementation of Qwen-LM☆617Updated last year
- The official codes for "Aurora: Activating chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning"☆265Updated last year
- Phi2-Chinese-0.2B: train your own small Chinese Phi2 chat model from scratch, with support for LangChain integration to load a local knowledge base for retrieval-augmented generation (RAG).☆584Updated last year
- 360zhinao☆291Updated 8 months ago
- BayLing (百聆) is a LLaMA-based English/Chinese large language model enhanced with language alignment; it has strong English/Chinese ability and reaches 90% of ChatGPT's performance on multilingual and general-task benchmarks.☆318Updated last year
- GAOKAO-Bench is an evaluation framework that utilizes GAOKAO questions as a dataset to evaluate large language models.☆699Updated last year
- Chinese Community for Google Gemma LLM☆63Updated last year
- Skywork series models are pre-trained on 3.2TB of high-quality multilingual (mainly Chinese and English) and code data. We have open-sour…☆1,470Updated 10 months ago
- SwanLab Self-hosted Service | SwanLab private deployment service☆39Updated 2 weeks ago
- ☆975Updated 11 months ago
- OrionStar-Yi-34B-Chat is an open-source Chinese/English chat model fine-tuned by OrionStar on the open-source Yi-34B model with 150k+ high-quality corpus samples.☆265Updated last year
- [EMNLP'24] CharacterGLM: Customizing Chinese Conversational AI Characters with Large Language Models☆489Updated 3 months ago
- ☆181Updated this week
- A purer tokenizer with a higher compression ratio☆488Updated last year
- LLM Inference benchmark☆432Updated last year