randombenj / detectron2onnx-inference

Export a [detectron2](https://github.com/facebookresearch/detectron2) model to [ONNX](https://github.com/onnx/onnx) and run inference using the [caffe2 ONNX backend](https://pytorch.org/tutorials/advanced/super_resolution_with_caffe2.html). This lets you run inference on a Raspberry Pi with acceptable inference times.
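The export-then-infer flow described above can be sketched roughly as below. This is a hedged sketch, not the repository's actual code: the function names `export_to_onnx` and `run_onnx_inference`, the model path, and the input shape are assumptions, and the sketch assumes `detectron2` (with its `Caffe2Tracer` export helper), `onnx`, and `caffe2` are installed.

```python
# Hedged sketch of the two steps: (1) trace a detectron2 model and export
# it as ONNX, (2) load that ONNX file and run it through the caffe2 backend.
# Library imports live inside the functions so this file can be read (and
# the functions defined) without the heavy dependencies installed.

def export_to_onnx(cfg, model, sample_inputs, out_path):
    """Trace a detectron2 model and serialize it as ONNX.

    `cfg` is a detectron2 config, `model` the torch model, and
    `sample_inputs` a batched-inputs list used for tracing (assumption:
    one example dict with an "image" tensor, as detectron2 expects).
    """
    import onnx
    from detectron2.export import Caffe2Tracer

    tracer = Caffe2Tracer(cfg, model, sample_inputs)
    onnx_model = tracer.export_onnx()       # trace and convert the graph
    onnx.save(onnx_model, out_path)


def run_onnx_inference(model_path, image):
    """Run one forward pass of an exported ONNX model on `image`
    (a NumPy array shaped as the exported model expects)."""
    import onnx
    import caffe2.python.onnx.backend as backend

    model = onnx.load(model_path)           # parse the serialized graph
    onnx.checker.check_model(model)         # sanity-check the export
    rep = backend.prepare(model, device="CPU")  # no GPU on a Raspberry Pi
    return rep.run([image])                 # tuple of output tensors
```

On the Pi itself only `run_onnx_inference` (and its `onnx`/`caffe2` dependencies) is needed; the export step runs once on a development machine.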
