SNPE version : 1.17.0
Model : Tensorflow MobilenetSSD
Hardware: Snapdragon 820 based board with DSP
My intention is to run the MobilenetSSD object detection model on the DSP. I have converted the COCO-trained model into a DLC file.
My code is based on the "examples/NativeCpp/SampleCode/" application, and I am using the ITensor method to load input images. I have the CPU fallback option enabled, as mentioned in the documentation.

My code can do classification with a DLC file converted from the Inception_v3 model on both the GPU and DSP runtimes. I can also do object detection with a DLC file converted from the MobilenetSSD model on the GPU, but I am facing an issue when running it on the DSP: it executes without any errors, yet the detection results are clearly wrong. I see ~100 objects detected, each with its probability set to 100%.
I am not sure whether MobilenetSSD is supported on the DSP. I do see a lower network execution time than on the GPU, as expected.
Any help or pointers will be helpful.
Thanks,
Pragnesh
Hi Pragnesh,
Did you fix the issue?
I have the same issue.