Hi,

I have some self-trained computer vision models, trained with Gluon and 
exported to symbol/params files. I'm looking for a way to run the models 
on the smallest possible edge device, so that I don't have to send the 
video feed to a server and do inference remotely.

I would like to do inference on an ESP32 with a camera using my own MXNet 
models. Does anyone have experience with this? I found that TensorFlow is 
supported on the ESP32 to some extent (via TensorFlow Lite for 
Microcontrollers), but I could not find anything similar for MXNet or ONNX.

Best,
Blake

---
[Visit Topic](https://discuss.mxnet.apache.org/t/inference-on-esp32/6890/1) or 
reply to this email to respond.
