Hello,
I am currently developing an Android application that runs a quantized YOLOX DLC model using SNPE. Model inference works fine on the CPU and GPU runtimes, but fails on the DSP runtime with the following error:
com.qualcomm.qti.snpe.SnpeError$NativeException: Unable to create network! Cause: error_code=1002; error_message=Layer parameter value is invalid. No backend could validate Op=strided_slice_0 Type=StridedSlice error code=3110; error_component=Model Validation; line_no=131; thread_id=492875624144
Background:
Model: YOLOX-Nano
Conversion Process:
- Converted 'yolox_nano.onnx' to DLC format using 'snpe-onnx-to-dlc'.
- Quantized the DLC model using 'snpe-dlc-quantize'.
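For reference, the two conversion steps above typically look like the following. File names and the calibration input list are placeholders; the input list is a text file with one path to a preprocessed raw input tensor per line:

```shell
# Convert the ONNX model to DLC format
snpe-onnx-to-dlc --input_network yolox_nano.onnx \
                 --output_path yolox_nano.dlc

# Quantize the DLC using representative calibration inputs
snpe-dlc-quantize --input_dlc yolox_nano.dlc \
                  --input_list calibration_inputs.txt \
                  --output_dlc yolox_nano_quantized.dlc
```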
It looks like you're encountering an issue with the DSP runtime in the Qualcomm Neural Processing SDK while deploying a quantized YOLOX model. The error indicates that the DSP backend failed to validate the StridedSlice layer (Op=strided_slice_0) during model validation. A common cause is a version mismatch: the StridedSlice parameters emitted by one SNPE converter version may not be accepted by the DSP backend of another. I would suggest checking version compatibility between the SNPE release used for conversion and quantization and the one used at runtime, and trying a different SNPE version if they differ.
Thank you for your response. I checked version compatibility as you suggested: after switching to SNPE 1.61.40.4243 and converting and quantizing the model again, inference on the DSP runtime ran successfully. Thank you for your assistance.
I'm now trying to use a DLC model converted from the YOLOX_s model, but the inference results from the DLC are abnormal. Could there be an issue in the snpe-onnx-to-dlc conversion process? Inference with the original ONNX model completes normally, but with the DLC model the class IDs are fixed at 0 and 5, and the confidence values range from -0.0002 to 0.024 instead of falling between 0 and 1. Could this be due to an SNPE version issue? I would appreciate your help.
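For what it's worth, confidence values that dip below zero usually mean raw head logits are being read without YOLOX's decode step. A minimal NumPy sketch, assuming (hypothetically) that the DLC outputs raw logits rather than decoded scores:

```python
import numpy as np

def sigmoid(x):
    # Maps any real-valued logit into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical raw objectness/class logits as a DLC might emit them if
# the sigmoid in YOLOX's head was not applied; note the negative value.
raw_logits = np.array([-8.2, -0.0002, 0.024, 3.1])

scores = sigmoid(raw_logits)
print(scores)  # every score now lies strictly between 0 and 1
```

In YOLOX the final confidence is sigmoid(objectness) multiplied by sigmoid(class score), so if the DLC's outputs still look like unbounded logits, comparing the DLC's intermediate layer outputs against the ONNX model's can help locate where the two diverge.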