CrazyBoyM / CodeLLaMA-chat
CodeLLaMA Chinese version - a code generation assistant; 20,000+ cumulative downloads on Hugging Face
☆45 · Updated last year
Alternatives and similar repositories for CodeLLaMA-chat
Users interested in CodeLLaMA-chat are comparing it to the repositories listed below.
- zero: zero-training LLM tuning ☆32 · Updated 2 years ago
- ☆106 · Updated last year
- ☆94 · Updated 8 months ago
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆132 · Updated last year
- ☆23 · Updated 4 months ago
- Mixture-of-Experts (MoE) Language Model ☆189 · Updated 11 months ago
- XVERSE-65B: A multilingual large language model developed by XVERSE Technology Inc. ☆141 · Updated last year
- Fast LLM training codebase with dynamic strategy selection [DeepSpeed + Megatron + FlashAttention + CUDA fusion kernels + compiler] ☆41 · Updated last year
- Lightweight local website for displaying the performance of different chat models ☆87 · Updated last year
- SUS-Chat: Instruction tuning done right ☆49 · Updated last year
- ☆64 · Updated last year
- Complete training code for the open-source high-performance Llama model, covering the full pipeline from pre-training to RLHF ☆69 · Updated 2 years ago
- AGM (阿格姆): an AI gene-map model that explores the inner workings of AI models and GPT/LLM large models from the token-weight granularity perspective ☆29 · Updated 2 years ago
- An open-source LLM based on an MoE structure ☆58 · Updated last year
- Imitate OpenAI with local models ☆89 · Updated last year
- Another ChatGLM2 implementation for GPTQ quantization ☆55 · Updated last year
- The newest version of Llama 3, with source code explained line by line in Chinese ☆22 · Updated last year
- A fine-tuned LLaMA that is good at arithmetic tasks ☆177 · Updated last year
- The official code for "Aurora: Activating Chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning" ☆265 · Updated last year
- ☆124 · Updated last year
- Silk Road will be the dataset zoo for Luotuo (骆驼). Luotuo is an open-source Chinese LLM project founded by 陈启源 @ 华中师范大学 & 李鲁鲁 @ 商汤科技 & 冷子… ☆40 · Updated last year
- Leveraging large language models for text-to-SQL synthesis, this project fine-tunes WizardLM/WizardCoder-15B-V1.0 with QLoRA on a custom … ☆44 · Updated last year
- ☆30 · Updated last year
- ⚡ Boost the inference speed of GPT models in transformers with onnxruntime ☆53 · Updated 2 years ago
- Open efforts to implement ChatGPT-like models and beyond ☆109 · Updated last year
- LongQLoRA: Extend the context length of LLMs efficiently ☆166 · Updated last year
- Unleashing the Power of Cognitive Dynamics on Large Language Models ☆63 · Updated 11 months ago
- Delta-CoMe achieves near-lossless 1-bit compression; accepted at NeurIPS 2024 ☆56 · Updated 9 months ago
- The world's first Chinese-optimized version of StableVicuna ☆63 · Updated 2 years ago
- FasterTransformer for the CodeGeeX model ☆64 · Updated 2 years ago