rajeevsrao / TensorRT
TensorRT is a C++ library for high-performance inference on NVIDIA GPUs and deep learning accelerators.
☆19 · Updated last year
Alternatives and similar repositories for TensorRT
Users interested in TensorRT are comparing it to the libraries listed below.
- stable diffusion, controlnet, tensorrt, accelerate · ☆57 · Updated 2 years ago
- https://wavespeed.ai/ — Context-parallel attention that accelerates DiT model inference with dynamic caching · ☆290 · Updated 3 weeks ago
- Experimental usage of stable-fast and TensorRT. · ☆207 · Updated 10 months ago
- Faster generation with text-to-image diffusion models. · ☆214 · Updated 7 months ago
- [NeurIPS 2024] AsyncDiff: Parallelizing Diffusion Models by Asynchronous Denoising