MooreThreads / vllm_musa
A high-throughput and memory-efficient inference and serving engine for LLMs
☆49, updated 6 months ago
Alternatives and similar repositories for vllm_musa
Users interested in vllm_musa are comparing it to the repositories listed below.
- Run ChatGLM2-6B on BM1684X (☆49, updated last year)
- DashInfer is a native LLM inference engine aiming to deliver industry-leading performance atop various hardware architectures, including … (☆251, updated this week)
- PaddlePaddle custom device implementation (☆83, updated this week)
- ☆162, updated last month
- Triton documentation in Simplified Chinese / Triton 中文文档 (☆71, updated last month)
- vLLM documentation in Simplified Chinese / vLLM 中文文档 (☆69, updated this week)
- Run generative AI models on Sophgo BM1684X/BM1688 (☆208, updated last week)
- torch_musa is an open-source repository based on PyTorch, which can make full use of the super computing power of MooreThreads graphics c… (☆395, updated last week)
- Ascend PyTorch adapter (torch_npu). Mirror of https://gitee.com/ascend/pytorch (☆354, updated this week)
- A small language model supporting Chinese-language scenarios: llama2.c-zh (☆147, updated last year)
- Explore LLM deployment on AXera's AI chips (☆103, updated this week)
- Run DeepSeek-R1 GGUFs on KTransformers (☆227, updated 2 months ago)
- LLM101n: Let's build a Storyteller, Chinese edition (☆132, updated 9 months ago)
- ☆44, updated 6 months ago
- Run ChatGLM3-6B on BM1684X (☆38, updated last year)
- llm-export can export LLM models to ONNX (☆289, updated 4 months ago)
- ☆310, updated 5 months ago
- ☆127, updated 4 months ago
- Deploying a large language model on Android phones based on MNN-llm: Qwen1.5-0.5B-Chat (☆77, updated last year)
- ☆27, updated 6 months ago
- Export llama to ONNX (☆124, updated 4 months ago)
- Compare different hardware platforms via the Roofline Model for LLM inference tasks (☆100, updated last year)
- Community-maintained hardware plugin for vLLM on Ascend (☆631, updated this week)
- A llama model inference framework implemented in CUDA C++ (☆56, updated 6 months ago)
- C++ implementation of Qwen-LM (☆587, updated 5 months ago)
- ☆45, updated last year
- Hands-on LLM deployment: TensorRT-LLM, Triton Inference Server, vLLM (☆26, updated last year)
- ☆139, updated last year
- ☆48, updated last week
- ☆90, updated last year
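The Roofline comparison mentioned in the list above can be sketched in a few lines. The model caps attainable throughput at the lesser of peak compute and memory bandwidth times operational intensity; the peak-FLOP/s and bandwidth figures below are illustrative assumptions, not measurements of any chip listed here:

```python
def roofline_flops(peak_flops: float, peak_bw_bytes: float, intensity: float) -> float:
    """Attainable FLOP/s under the Roofline model:
    min(peak compute, memory bandwidth * operational intensity in FLOPs/byte)."""
    return min(peak_flops, peak_bw_bytes * intensity)

# Assumed accelerator: 312 TFLOP/s peak compute, 2 TB/s memory bandwidth.
PEAK, BW = 312e12, 2e12

# LLM decode is typically low-intensity (~1 FLOP/byte): memory-bound.
decode = roofline_flops(PEAK, BW, 1.0)
# GEMM-heavy prefill has high intensity: compute-bound at the peak.
prefill = roofline_flops(PEAK, BW, 1000.0)
print(decode, prefill)
```

A quick way to see why such repositories plot LLM inference against the roofline: decode throughput scales with bandwidth, while prefill throughput scales with peak compute, so two chips can rank differently on each phase.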