wangzyon / trt_learn
TensorRT encapsulation, learn, rewrite, practice.
☆28 · Updated 2 years ago
Alternatives and similar repositories for trt_learn
Users interested in trt_learn are comparing it to the repositories listed below.
- ☆30 · Updated 9 months ago
- ☆26 · Updated 2 years ago
- EasyNN is a neural network inference framework built for teaching, aiming to let anyone write an inference framework on their own, even with zero prior background. ☆32 · Updated 11 months ago
- A llama model inference framework implemented in CUDA C++. ☆60 · Updated 9 months ago
- Quick and Self-Contained TensorRT Custom Plugin Implementation and Integration ☆68 · Updated 3 months ago
- Speed up image preprocessing with CUDA when handling images or running TensorRT inference. ☆76 · Updated 3 weeks ago
- Companion code for the Bilibili video https://www.bilibili.com/video/BV18L41197Uz/?spm_id_from=333.788&vd_source=eefa4b6e337f16d87d87c2c357db8ca7. ☆70 · Updated last year
- An ONNX-based quantization tool. ☆71 · Updated last year
- ☆37 · Updated 10 months ago
- A lightweight llama-like LLM inference framework based on Triton kernels. ☆147 · Updated 2 weeks ago
- Courses on Bilibili. ☆75 · Updated 2 years ago
- ☆10 · Updated last year
- Serving Inside Pytorch ☆162 · Updated 2 weeks ago
- LLM deployment project based on ONNX. ☆43 · Updated 10 months ago
- A large number of CUDA/TensorRT example cases to learn from. ☆145 · Updated 3 years ago
- ☆20 · Updated last year
- A toolkit to help optimize large ONNX models. ☆158 · Updated last year
- Uses a pattern matcher on ONNX models to match and replace subgraphs. ☆81 · Updated last year
- Llama3 Streaming Chat Sample ☆22 · Updated last year
- Async inference for machine learning models. ☆26 · Updated 2 years ago
- Large language model ONNX inference framework. ☆36 · Updated 7 months ago
- A collection of coding tutorials for my YouTube videos on neural network quantization. ☆18 · Updated last year
- A repository for practicing multi-threaded programming in C++. ☆25 · Updated last year
- Simplify large ONNX models (>2 GB). ☆63 · Updated 8 months ago
- A simplified flash-attention implementation using CUTLASS, intended for teaching. ☆46 · Updated last year
- ☆35 · Updated 3 months ago
- ☆42 · Updated last year
- ☆12 · Updated 5 months ago
- Awesome code, projects, books, etc. related to CUDA ☆23 · Updated last week
- ☆47 · Updated 2 years ago