shiyemin / light-hf-proxy
A light proxy solution for HuggingFace hub.
☆47 · Updated last year
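Proxies like this generally work by re-exposing the Hub API at an alternate endpoint. A minimal sketch of pointing `huggingface_hub`-based tooling at such a proxy, assuming a hypothetical proxy address of `http://localhost:8080` (the library honors the `HF_ENDPOINT` environment variable):

```shell
# Point huggingface_hub-based tooling at a local proxy instead of huggingface.co.
# http://localhost:8080 is a hypothetical address; substitute your own deployment.
export HF_ENDPOINT=http://localhost:8080

# Subsequent downloads then resolve through the proxy, e.g.:
#   huggingface-cli download bert-base-uncased
echo "$HF_ENDPOINT"
```

Unsetting `HF_ENDPOINT` (or restarting the shell) restores the default `https://huggingface.co` endpoint.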
Alternatives and similar repositories for light-hf-proxy:
Users interested in light-hf-proxy are comparing it to the repositories listed below.
- A lightweight local website for displaying the performance of different chat models.☆86 · Updated last year
- Complete training code for the open-source high-performance Llama model, covering the full pipeline from pre-training to RLHF.☆69 · Updated last year
- ☆36 · Updated 8 months ago
- SUS-Chat: Instruction tuning done right☆48 · Updated last year
- An open-source multimodal large language model based on baichuan-7b.☆73 · Updated last year
- A simple MLLM that surpasses QwenVL-Max using open-source data only, with a 14B LLM.☆37 · Updated 7 months ago
- Code for "Scaling Laws of RoPE-based Extrapolation".☆73 · Updated last year
- Our 2nd-gen LMM☆33 · Updated 11 months ago
- ☆78 · Updated last year
- 1.4B sLLM for Chinese and English - HammerLLM🔨☆44 · Updated last year
- SkyScript-100M: 1,000,000,000 Pairs of Scripts and Shooting Scripts for Short Drama: https://arxiv.org/abs/2408.09333v2☆119 · Updated 5 months ago
- Gaokao Benchmark for AI☆108 · Updated 2 years ago
- A more efficient GLM implementation!☆55 · Updated 2 years ago
- A personal reimplementation of Google's Infini-transformer, using a small 2b model. The project includes both model and train…☆56 · Updated last year
- XVERSE-MoE-A4.2B: A multilingual large language model developed by XVERSE Technology Inc.☆38 · Updated 11 months ago
- AGM (阿格姆): an AI gene-map model that explores the inner workings of AI models (GPT/LLM large models) from the perspective of token-weight granules.☆28 · Updated last year
- Chinese CLIP models with SOTA performance.☆55 · Updated last year
- The newest version of Llama 3, with source code explained line by line in Chinese☆22 · Updated last year
- GLM Series Edge Models☆137 · Updated 2 months ago
- Zero (零): LLM training and hyperparameter tuning☆31 · Updated last year
- A fast LLM training codebase with dynamic strategy selection (DeepSpeed + Megatron + FlashAttention + CUDA fusion kernels + compiler)☆37 · Updated last year
- Silk Road will be the dataset zoo for Luotuo (骆驼). Luotuo is an open-sourced Chinese LLM project founded by 陈启源 @ 华中师范大学 & 李鲁鲁 @ 商汤科技 & 冷子…☆38 · Updated last year
- Ongoing research training transformer language models at scale, including: BERT & GPT-2☆19 · Updated last year
- An open-source LLM based on an MoE structure.☆58 · Updated 10 months ago
- ☆29 · Updated 8 months ago
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc.☆140 · Updated last year
- SuperCLUE-Math6: exploring a new generation of native-Chinese multi-turn, multi-step mathematical reasoning datasets☆54 · Updated last year
- kimi-chat test data☆7 · Updated last year
- Imitate OpenAI with Local Models☆88 · Updated 8 months ago
- Skywork-MoE: A Deep Dive into Training Techniques for Mixture-of-Experts Language Models☆132 · Updated 10 months ago