k9ele7en / Triton-TensorRT-Inference-CRAFT-pytorch
Advanced inference pipeline using NVIDIA Triton Inference Server for CRAFT text detection (PyTorch). Includes a converter from PyTorch -> ONNX -> TensorRT and inference pipelines (TensorRT, Triton server, multi-format). Supported model formats for Triton inference: TensorRT engine, TorchScript, ONNX.
33 stars · Aug 18, 2021 · Updated 4 years ago

Alternatives and similar repositories for Triton-TensorRT-Inference-CRAFT-pytorch
