randombenj / detectron2onnx-inference

Export a [detectron2](https://github.com/facebookresearch/detectron2) model to [onnx](https://github.com/onnx/onnx) and run inference using the [caffe2 onnx backend](https://pytorch.org/tutorials/advanced/super_resolution_with_caffe2.html). This lets you run inference on a Raspberry Pi with acceptable inference times.
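The export-then-infer flow described above can be sketched roughly as follows. This is a hedged sketch, not this repository's actual code: it assumes detectron2's `Caffe2Tracer` export API and the `caffe2.python.onnx.backend` interface from the linked PyTorch tutorial; the config name, file paths, and input layout are placeholders.

```python
# Sketch: export a detectron2 model to ONNX, then run it through the
# caffe2 onnx backend. Requires torch, detectron2, onnx, and caffe2;
# "input.jpg" and the chosen model-zoo config are illustrative only.
import numpy as np
import torch
import onnx
import caffe2.python.onnx.backend as backend

from detectron2 import model_zoo
from detectron2.checkpoint import DetectionCheckpointer
from detectron2.config import get_cfg
from detectron2.data.detection_utils import read_image
from detectron2.export import Caffe2Tracer
from detectron2.modeling import build_model

# Build a model-zoo model on CPU (the target here is a Raspberry Pi).
cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
cfg.MODEL.DEVICE = "cpu"
model = build_model(cfg)
DetectionCheckpointer(model).load(cfg.MODEL.WEIGHTS)
model.eval()

# Trace with one sample input and export to ONNX.
image = read_image("input.jpg", format="BGR")            # HWC, uint8
sample = [{"image": torch.as_tensor(image.transpose(2, 0, 1))}]
onnx_model = Caffe2Tracer(cfg, model, sample).export_onnx()
onnx.save(onnx_model, "model.onnx")

# Run the exported graph with the caffe2 onnx backend
# (prepare/run, as in the linked tutorial).
rep = backend.prepare(onnx.load("model.onnx"), device="CPU")
outputs = rep.run(np.expand_dims(image.transpose(2, 0, 1), 0)
                    .astype(np.float32))
print(outputs)
```

On the device itself only the last three lines matter: ship `model.onnx` to the Pi and run `backend.prepare(...)` there, so the heavy detectron2/torch stack is only needed on the machine doing the export.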
