Hi, this is a follow-up to a previous question of mine: I need to run a custom 
object detection model on Android.

I have already fine-tuned the model (ssd_512_mobilenet1.0_custom) on a custom 
dataset. I tried running inference with this model (loading the .params file 
produced during training), and everything works perfectly on my computer. 
Now I need to export it to Android.
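
For reference, the local inference that works is roughly this (a minimal 
sketch; the file names and class list are from my setup):

```
from gluoncv import model_zoo, data

# re-create the fine-tuned network and load the trained weights
net = model_zoo.get_model('ssd_512_mobilenet1.0_custom',
                          classes=["CML_mug"], pretrained_base=False)
net.load_parameters("ep_035.params")  # file produced by training

# standard GluonCV SSD preprocessing (resizes the short side to 512)
x, img = data.transforms.presets.ssd.load_test("test.jpg", short=512)
class_ids, scores, bboxes = net(x)
```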

I was referring to [this 
answer](https://stackoverflow.com/questions/61607927/using-gluoncv-models-in-tensorflow-lite)
 to figure out the procedure; it suggests three options:

1. You can use ONNX to convert models to other runtimes, for example [...] 
[NNAPI](https://github.com/JDAI-CV/DNNLibrary) for Android
2. You can use [TVM](https://docs.tvm.ai/deploy/android.html)
3. You can use SageMaker Neo + [DLR 
runtime](https://github.com/neo-ai/neo-ai-dlr) [...]


Regarding the first one, I converted my model to ONNX (thanks again to 
@waytrue17 for the help; a sketch of the export step is below).
However, in order to use it with NNAPI, it is necessary to convert it to the 
daq format. The repository provides a precompiled AppImage of onnx2daq to make 
the conversion, but the script returns an error. I checked the issues section, 
and [they report](https://github.com/JDAI-CV/DNNLibrary/issues/64) that "It 
actually fails for all onnx object detection models".
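
The ONNX export itself was done roughly like this (a sketch of the standard 
MXNet export path; file names and the (1, 3, 512, 512) input shape are from 
my setup):

```
import numpy as np
import mxnet as mx
from mxnet.contrib import onnx as onnx_mxnet
from gluoncv import model_zoo

net = model_zoo.get_model('ssd_512_mobilenet1.0_custom',
                          classes=["CML_mug"], pretrained_base=False)
net.load_parameters("ep_035.params")

# serialize to symbol + params so the ONNX exporter can read the graph
net.hybridize()
net(mx.nd.zeros((1, 3, 512, 512)))          # one forward pass to build the graph
net.export("ssd_512_mobilenet1.0_custom")   # writes -symbol.json and -0000.params

onnx_mxnet.export_model("ssd_512_mobilenet1.0_custom-symbol.json",
                        "ssd_512_mobilenet1.0_custom-0000.params",
                        [(1, 3, 512, 512)], np.float32, "model.onnx")
```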


Then, I gave DLR a try, since it is suggested as the easiest way.
As I understand it, in order to use my custom model with DLR, I would first 
need to compile it with TVM (which also covers the second point mentioned in 
the linked post). The repo provides a Docker image with conversion scripts for 
several frameworks.
I modified the `compile_gluoncv.py` script, and now I have:

```
#!/usr/bin/env python3

from tvm import relay
import mxnet as mx
from mxnet.gluon.model_zoo.vision import get_model
from tvm_compiler_utils import tvm_compile

shape_dict = {'data': (1, 3, 300, 300)}
dtype = 'float32'
ctx = [mx.cpu(0)]

classes_custom = ["CML_mug"]
block = get_model('ssd_512_mobilenet1.0_custom', classes=classes_custom,
                  pretrained_base=False, ctx=ctx)
# "ep_035.params" is the file produced by training on the custom dataset
block.load_parameters("ep_035.params", ctx=ctx)

dlr_model_name = "ssd_512_mobilenet1.0_custom"

# compile one DLR artifact per target Android ABI
for arch in ["arm64-v8a", "armeabi-v7a", "x86_64", "x86"]:
    sym, params = relay.frontend.from_mxnet(block, shape=shape_dict, dtype=dtype)
    func = sym["main"]
    func = relay.Function(func.params, relay.nn.softmax(func.body), None,
                          func.type_params, func.attrs)
    tvm_compile(func, params, arch, dlr_model_name)
```

However, when I run the script, it returns this error:
```
ValueError: Model ssd_512_mobilenet1.0_custom is not supported. Available 
options are
        alexnet
        densenet121
        densenet161
        densenet169
        densenet201
        inceptionv3
        mobilenet0.25
        mobilenet0.5
        mobilenet0.75
        mobilenet1.0
        mobilenetv2_0.25
        mobilenetv2_0.5
        mobilenetv2_0.75
        mobilenetv2_1.0
        resnet101_v1
        resnet101_v2
        resnet152_v1
        resnet152_v2
        resnet18_v1
        resnet18_v2
        resnet34_v1
        resnet34_v2
        resnet50_v1
        resnet50_v2
        squeezenet1.0
        squeezenet1.1
        vgg11
        vgg11_bn
        vgg13
        vgg13_bn
        vgg16
        vgg16_bn
        vgg19
        vgg19_bn
```
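
One thing I noticed while writing this up: the models listed in the error are 
exactly the classification networks from mxnet.gluon.model_zoo.vision, so 
maybe the problem is simply that the script imports get_model from there 
instead of from GluonCV's model zoo, which is where the SSD variants live. If 
that's right, the fix would be something like this (my assumption, not tested 
inside the Docker image yet; ctx as in the script above):

```
# assumption: import get_model from GluonCV rather than the plain MXNet
# model zoo, since only GluonCV ships the detection models (ssd_*, yolo3_*, ...)
from gluoncv.model_zoo import get_model

block = get_model('ssd_512_mobilenet1.0_custom', classes=["CML_mug"],
                  pretrained_base=False, ctx=ctx)
block.load_parameters("ep_035.params", ctx=ctx)
```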

Am I doing something wrong? Is this even possible?

As a side note, after this I will also need to deploy a pose estimation model 
(simple_pose_resnet18_v1b) and an action recognition one 
(i3d_nl10_resnet101_v1_kinetics400) on Android.
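
If the TVM route works, I assume the same flow would apply to those models 
too; a sketch for the pose network (the (1, 3, 256, 192) input shape is the 
simple_pose default, which I would still need to verify):

```
# sketch only: same relay.frontend.from_mxnet flow as in the script above
from tvm import relay
from gluoncv.model_zoo import get_model

pose_net = get_model('simple_pose_resnet18_v1b', pretrained=True)
mod, params = relay.frontend.from_mxnet(pose_net,
                                        shape={'data': (1, 3, 256, 192)})
```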
