harleyszhang / lite_llama
A lightweight llama-like LLM inference framework built on Triton kernels.
☆147 · Updated 3 weeks ago
Alternatives and similar repositories for lite_llama
Users interested in lite_llama are comparing it to the libraries listed below.
- ☆35 · Updated 3 months ago
- A llama model inference framework implemented in CUDA C++ ☆60 · Updated 9 months ago
- Learning how CUDA works ☆304 · Updated 5 months ago
- LLM theoretical performance analysis tool supporting parameter, FLOPs, memory, and latency analysis ☆103 · Updated last month
- A CUDA tutorial for learning CUDA programming from scratch ☆248 · Updated last year
- A good project for campus recruiting (autumn/spring hiring) and internships, walking you through building from scratch an LLM inference framework that supports LLama2/3 and Qwen2.5 ☆412 · Updated last month
- How to learn PyTorch and OneFlow ☆449 · Updated last year
- A tutorial for CUDA & PyTorch ☆154 · Updated 7 months ago
- Courses on Bilibili ☆75 · Updated 2 years ago
- ☆138 · Updated last year
- ☆25 · Updated 2 weeks ago
- Hand-written CUDA kernels and interview guide ☆550 · Updated this week
- 🤖 FFPA: extends FlashAttention-2 with Split-D, ~O(1) SRAM complexity for large headdim, 1.8x~3x↑🎉 vs SDPA EA ☆211 · Updated 2 weeks ago
- Examples of CUDA implementations with CUTLASS CuTe ☆219 · Updated last month
- ☆290 · Updated 10 months ago
- Softmax optimizations in Triton for a variety of cases (see the sketch after this list) ☆21 · Updated 11 months ago
- Implementing custom operators in PyTorch with CUDA/C++ ☆69 · Updated 2 years ago
- Some common CUDA kernel implementations (not the fastest) ☆24 · Updated last week
- CPU memory, compilers, and parallel programming ☆26 · Updated 9 months ago
- Flash attention tutorial written in Python, Triton, CUDA, and CUTLASS ☆408 · Updated 3 months ago
- A minimalist and extensible PyTorch extension for implementing custom backend operators ☆33 · Updated last year
- ☆59 · Updated 9 months ago
- A layered, decoupled deep learning inference engine ☆75 · Updated 6 months ago
- Triton documentation in Simplified Chinese / Triton 中文文档 ☆80 · Updated 4 months ago
- ☆150 · Updated 7 months ago
- EasyNN is a neural network inference framework developed for teaching, aiming to let anyone write an inference framework on their own, even with zero prior experience! ☆32 · Updated last year
- ☆100 · Updated 3 months ago
- ☆130 · Updated 8 months ago
- A simplified flash-attention implemented with CUTLASS, intended for teaching ☆46 · Updated last year
- 📚 200+ Tensor/CUDA Core kernels, ⚡️flash-attn-mma, ⚡️hgemm with WMMA, MMA, and CuTe (98%~100% TFLOPS of cuBLAS/FA2 🎉🎉) ☆39 · Updated 4 months ago
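Several of the repositories above revolve around writing GPU kernels in Triton (lite_llama itself, the Triton softmax entry, the Triton documentation). As a rough illustration of what such a kernel looks like, here is a minimal row-wise softmax sketch following the standard Triton tutorial pattern; the function names and block-size handling are illustrative assumptions, not code taken from any listed repository.

```python
import torch
import triton
import triton.language as tl


@triton.jit
def softmax_kernel(out_ptr, in_ptr, n_cols, in_row_stride, out_row_stride,
                   BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one row of the input matrix.
    row_idx = tl.program_id(0)
    col_offsets = tl.arange(0, BLOCK_SIZE)
    mask = col_offsets < n_cols
    # Load one row; out-of-range columns are padded with -inf so they
    # do not affect the row maximum or the sum of exponentials.
    row = tl.load(in_ptr + row_idx * in_row_stride + col_offsets,
                  mask=mask, other=-float("inf"))
    # Numerically stable softmax: subtract the row max before exponentiation.
    row = row - tl.max(row, axis=0)
    num = tl.exp(row)
    denom = tl.sum(num, axis=0)
    tl.store(out_ptr + row_idx * out_row_stride + col_offsets,
             num / denom, mask=mask)


def softmax(x: torch.Tensor) -> torch.Tensor:
    # One program per row; BLOCK_SIZE must be a power of two >= n_cols.
    n_rows, n_cols = x.shape
    out = torch.empty_like(x)
    BLOCK_SIZE = triton.next_power_of_2(n_cols)
    softmax_kernel[(n_rows,)](out, x, n_cols, x.stride(0), out.stride(0),
                              BLOCK_SIZE=BLOCK_SIZE)
    return out


# Example usage (requires a CUDA device):
# y = softmax(torch.randn(4, 1000, device="cuda"))
```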