Forums - Runtime Error SNPE

luca.padovan
Join Date: 28 Sep 18
Posts: 12
Posted: Thu, 2018-10-18 06:03

I am using the Snapdragon Neural Processing Engine (SNPE) 1.19.2 and am trying to build the Android example app with Android Studio. I have also run the setup_inception_v3.py script, and I have the inception_v3.zip in the res/raw folder. When I build the project I get a success message. At runtime, when I try to build the network with the build() method:

final SNPE.NeuralNetworkBuilder builder = new SNPE.NeuralNetworkBuilder(mApplication)
        .setDebugEnabled(false)
        .setRuntimeOrder(mTargetRuntime)
        .setModel(mModel.file)
        .setCpuFallbackEnabled(true)
        .setUseUserSuppliedBuffers(mTensorFormat != SupportedTensorFormat.FLOAT);

final long start = SystemClock.elapsedRealtime();
network = builder.build();

I get the following error:

E/LoadNetworkTask: Unable to create network! Cause: error_code=307; error_message=Model record is missing in dlc. Missing mandatory record model; error_component=Dl Container; line_no=497; thread_id=547841111120
    java.lang.IllegalStateException: Unable to create network! Cause: error_code=307; error_message=Model record is missing in dlc. Missing mandatory record model; error_component=Dl Container; line_no=497; thread_id=547841111120
        at com.qualcomm.qti.snpe.internal.NativeNetwork.nativeInitFromFile(Native Method)
        at com.qualcomm.qti.snpe.internal.NativeNetwork.<init>(NativeNetwork.java:104)
        at com.qualcomm.qti.snpe.SNPE$NeuralNetworkBuilder.build(SNPE.java:272)
        at com.example.luca.myapplication2.tasks.LoadNetworkTask.doInBackground(LoadNetworkTask.java:58)
        at com.example.luca.myapplication2.tasks.LoadNetworkTask.doInBackground(LoadNetworkTask.java:18)
        at android.os.AsyncTask$2.call(AsyncTask.java:305)
        at java.util.concurrent.FutureTask.run(FutureTask.java:237)
        at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:243)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
        at java.lang.Thread.run(Thread.java:761)


I have tried with Ubuntu 14.04, 16.04, and 18.04 and get the same problem.

How can I fix this problem? Thanks.
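In case it helps with diagnosis: error 307 says the container being loaded has no "model" record, which suggests the archive the app extracts the DLC from is empty or was not packaged. A quick desktop-side check of the packaged archive can be sketched like this (the archive path and entry names are assumptions based on the default example layout; adjust to your project):

```python
import zipfile

def find_dlc_entries(archive_path):
    """Return the names of non-empty .dlc entries inside the res/raw zip.

    An empty list suggests the setup script did not package the model,
    which matches the 'Model record is missing in dlc' error.
    """
    with zipfile.ZipFile(archive_path) as zf:
        return [info.filename for info in zf.infolist()
                if info.filename.endswith(".dlc") and info.file_size > 0]

# Example (path is an assumption -- adjust to your project layout):
# print(find_dlc_entries("app/src/main/res/raw/inception_v3.zip"))
```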

shenyingying25
Join Date: 14 Nov 18
Posts: 16
Posted: Sun, 2018-11-25 18:55

In the end, did you resolve the problem? The same thing is happening to me.

paul.poloniewicz
Join Date: 17 Dec 18
Posts: 1
Posted: Tue, 2019-01-08 06:27

Hi. I am having the same problem - were you ever able to resolve it?

gesqdn-forum
Join Date: 4 Nov 18
Posts: 34
Posted: Tue, 2019-02-05 03:27

Hi,
After running the setup_inception_v3.py script, you have to follow a few more steps to prepare the app by copying the runtime library and the model.

Below are the instructions to follow after executing "setup_inception_v3.py":
$ cd $SNPE_ROOT/examples/android/image-classifiers
$ cp ../../../android/snpe-release.aar ./app/libs   # copies the SNPE runtime library
$ bash ./setup_inceptionv3.sh                       # packages the Inception v3 example (DLC, labels, inputs) as an Android resource file

Also follow the SNPE setup guide on QDN: https://developer.qualcomm.com/software/qualcomm-neural-processing-sdk/getting-started.
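As a quick sanity check after these steps, you can verify that both files landed where the app expects them. This is only a sketch; the paths assume the default image-classifiers example layout, so adjust APP_DIR if your project differs:

```shell
#!/bin/sh
# Sanity-check the example app layout after the setup steps above.
# APP_DIR assumes the default image-classifiers example (an assumption;
# adjust if your project layout differs).
APP_DIR=./app

check_file() {
    if [ -f "$1" ]; then
        echo "OK: $1"
    else
        echo "MISSING: $1" >&2
    fi
}

check_file "$APP_DIR/libs/snpe-release.aar"
check_file "$APP_DIR/src/main/res/raw/inception_v3.zip"
```

If either file is reported missing, the build error at runtime is expected, since the app has no model to extract.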

cciedump.spoto
Join Date: 23 Jan 19
Posts: 1
Posted: Thu, 2019-02-07 10:59

I have not tried the --use_gpu or --use_dsp options because I know those won't work without KGSL and FastRPC support. I am not sure about making the GPU runtime work, but for the aDSP, I think we can port FastRPC. I would like to know if someone is already working on GPU support.

gesqdn-forum
Join Date: 4 Nov 18
Posts: 34
Posted: Mon, 2019-02-11 21:50

Hi Cciedump,

We have worked on the GPU runtime. Can you let us know what kind of support you are expecting?

