Snapdragon Neural Processing Engine SDK
Reference Guide
This tutorial demonstrates how to build a C++ sample application that can execute neural network models on the PC or target device.
Note: While this sample code does not do any error checking, it is strongly recommended that users check for errors when using the SNPE APIs.
Most applications follow this general pattern when using a neural network:

1. Check which runtimes are available.
2. Load the network from its container file (DLC).
3. Load any UDO package(s).
4. Set network builder options and create the network instance.
5. Load the network inputs.
6. Execute the network and process its output.

The sections below describe how to implement each of these steps. For more details, please refer to the collection of source code files located at $SNPE_ROOT/examples/NativeCpp/SampleCode/jni.
The code excerpt below illustrates how to check if a specific runtime is available using the native APIs (the GPU runtime is used as an example).
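A minimal sketch of such a runtime check, in the spirit of the sample's CheckRuntime.cpp, assuming the SNPE headers from $SNPE_ROOT/include/zdl (the function name checkRuntime is illustrative):

```cpp
#include <iostream>

#include "SNPE/SNPEFactory.hpp"
#include "DlSystem/DlEnums.hpp"

// Query the native API for GPU runtime availability and fall back to
// the CPU runtime (always present) if the GPU runtime is unavailable.
static zdl::DlSystem::Runtime_t checkRuntime()
{
    zdl::DlSystem::Runtime_t runtime = zdl::DlSystem::Runtime_t::GPU;
    if (!zdl::SNPE::SNPEFactory::isRuntimeAvailable(runtime)) {
        std::cerr << "GPU runtime not available; falling back to CPU." << std::endl;
        runtime = zdl::DlSystem::Runtime_t::CPU;
    }
    return runtime;
}
```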
The code excerpt below illustrates how to load a network from the SNPE container file (DLC).
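A hedged sketch of loading the container, assuming the IDlContainer API from the SNPE headers (the function name loadContainer is illustrative; real code should check the returned pointer for nullptr):

```cpp
#include <memory>
#include <string>

#include "DlContainer/IDlContainer.hpp"

// Open a DLC file and return the container; IDlContainer::open returns
// nullptr on failure, so callers must validate the result.
static std::unique_ptr<zdl::DlContainer::IDlContainer>
loadContainer(const std::string& containerPath)
{
    return zdl::DlContainer::IDlContainer::open(
        zdl::DlSystem::String(containerPath.c_str()));
}
```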
The code excerpt below illustrates how to load UDO package(s).
SNPE can execute networks containing user-defined operations (UDOs). Please refer to the UDO Tutorial for details on implementing a UDO. Then add the "-u" option to the snpe-sample command line to execute with the UDO package(s).
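A sketch of registering UDO packages, assuming SNPEFactory::addOpPackage from the SNPE headers; the comma-splitting mirrors the comma-separated list accepted by the "-u" option (the function name loadUdoPackages is illustrative):

```cpp
#include <iostream>
#include <sstream>
#include <string>

#include "SNPE/SNPEFactory.hpp"

// Register each UDO package's registration library with SNPE. The paths
// arrive as a comma-separated list, as passed via snpe-sample's -u option.
static bool loadUdoPackages(const std::string& udoPackagePaths)
{
    std::istringstream stream(udoPackagePaths);
    std::string path;
    while (std::getline(stream, path, ',')) {
        if (!zdl::SNPE::SNPEFactory::addOpPackage(path)) {
            std::cerr << "Failed to load UDO package: " << path << std::endl;
            return false;
        }
    }
    return true;
}
```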
The following code demonstrates how to instantiate a SNPE Builder object, which will be used to execute the network with the given parameters.
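A sketch of the builder step, assuming the SNPEBuilder API from the SNPE headers (the function name setBuilderOptions is illustrative; setUseUserSuppliedBuffers selects UserBuffer mode versus ITensor mode):

```cpp
#include <memory>

#include "DlContainer/IDlContainer.hpp"
#include "SNPE/SNPE.hpp"
#include "SNPE/SNPEBuilder.hpp"

// Build an SNPE instance from a loaded container with the chosen runtime.
// An empty output-layer list means the network's default output is used.
static std::unique_ptr<zdl::SNPE::SNPE> setBuilderOptions(
    std::unique_ptr<zdl::DlContainer::IDlContainer>& container,
    zdl::DlSystem::Runtime_t runtime,
    bool useUserSuppliedBuffers)
{
    zdl::SNPE::SNPEBuilder snpeBuilder(container.get());
    return snpeBuilder
        .setOutputLayers({})
        .setRuntimeProcessor(runtime)
        .setUseUserSuppliedBuffers(useUserSuppliedBuffers)
        .build();
}
```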
Network inputs and outputs can be either user-backed buffers or ITensors (built-in SNPE buffers), but not both. The advantage of using user-backed buffers is that it eliminates an extra copy from user buffers to create ITensors. Both methods of loading network inputs are shown below.
SNPE can create its network inputs and outputs from user-backed buffers. Note that SNPE expects the values of the buffers to be present and valid during the duration of its execution.
Here is a function for creating a SNPE UserBuffer from a user-backed buffer and storing it in a zdl::DlSystem::UserBufferMap. These maps are a convenient collection of all input or output user buffers that can be passed to SNPE to execute the network.
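A hedged sketch of such a function, assuming the IUserBuffer and UserBufferMap APIs from the SNPE headers. Here the strides (in bytes) are taken as a caller-supplied parameter, since in a real application they should already be known; the snpeUserBackedBuffers vector keeps the IUserBuffer objects alive for as long as SNPE may use them:

```cpp
#include <cstdint>
#include <memory>
#include <vector>

#include "DlSystem/IUserBuffer.hpp"
#include "DlSystem/TensorShape.hpp"
#include "DlSystem/UserBufferMap.hpp"
#include "SNPE/SNPEFactory.hpp"

// Wrap an existing user-backed buffer in an SNPE IUserBuffer (here with a
// float32 encoding) and register it in the UserBufferMap under the given
// tensor name. No copy of the application data is made.
static void createUserBuffer(
    zdl::DlSystem::UserBufferMap& userBufferMap,
    std::vector<std::unique_ptr<zdl::DlSystem::IUserBuffer>>& snpeUserBackedBuffers,
    std::vector<uint8_t>& applicationBuffer,
    const zdl::DlSystem::TensorShape& strides,
    const char* name)
{
    zdl::DlSystem::UserBufferEncodingFloat userBufferEncodingFloat;
    snpeUserBackedBuffers.push_back(
        zdl::SNPE::SNPEFactory::getUserBufferFactory().createUserBuffer(
            applicationBuffer.data(), applicationBuffer.size(),
            strides, &userBufferEncodingFloat));
    userBufferMap.add(name, snpeUserBackedBuffers.back().get());
}
```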
Disclaimer: The strides of the buffer should already be known to the user and should not be calculated as shown below. The calculation shown is used solely for executing the example code.
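For completeness, here is what that example-only stride calculation looks like as a self-contained helper. It assumes a densely packed row-major layout, which is exactly why it should not be used in real applications where the data producer defines the strides (the helper name calcStrides is illustrative):

```cpp
#include <cstddef>
#include <vector>

// Compute dense (tightly packed) row-major strides, in bytes, for a tensor
// with the given dimensions and element size. The innermost dimension has
// stride == elementSize; each outer stride is the product of all inner
// dimension sizes times elementSize.
std::vector<size_t> calcStrides(const std::vector<size_t>& dims, size_t elementSize)
{
    std::vector<size_t> strides(dims.size());
    size_t stride = elementSize;
    for (size_t i = dims.size(); i > 0; --i) {
        strides[i - 1] = stride;
        stride *= dims[i - 1];
    }
    return strides;
}
```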
The following function then shows how to load input data from file(s) to user buffers. Note that the input values are simply loaded onto user-backed buffers, on top of which SNPE can create SNPE UserBuffers, as shown above.
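A self-contained sketch of that loading step, assuming the inputs are raw float32 files as listed in target_raw_list.txt (the function name loadRawFile is illustrative). The vector's storage must remain valid for the duration of SNPE execution, since SNPE UserBuffers are created on top of it:

```cpp
#include <fstream>
#include <string>
#include <vector>

// Read a raw float32 tensor file into a user-backed buffer. Returns false
// if the file cannot be opened or its size is not a multiple of float32.
bool loadRawFile(const std::string& filePath, std::vector<float>& buffer)
{
    std::ifstream in(filePath, std::ios::binary | std::ios::ate);
    if (!in.is_open()) return false;
    std::streamsize numBytes = in.tellg();
    if (numBytes % sizeof(float) != 0) return false;  // not a float32 payload
    buffer.resize(static_cast<size_t>(numBytes) / sizeof(float));
    in.seekg(0, std::ios::beg);
    return static_cast<bool>(
        in.read(reinterpret_cast<char*>(buffer.data()), numBytes));
}
```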
The following snippets of code use the native API to execute the network (in UserBuffer or ITensor mode) and show how to iterate through the newly populated output tensor.
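A hedged sketch of both execution modes, assuming the SNPE, UserBufferMap, and TensorMap APIs from the SNPE headers (the function names are illustrative, and the input ITensor is assumed to have been created and populated beforehand):

```cpp
#include <algorithm>
#include <iostream>

#include "DlSystem/ITensor.hpp"
#include "DlSystem/StringList.hpp"
#include "DlSystem/TensorMap.hpp"
#include "DlSystem/UserBufferMap.hpp"
#include "SNPE/SNPE.hpp"

// UserBuffer mode: both maps were registered up front, so SNPE reads inputs
// from, and writes results into, the user-backed buffers directly.
static bool executeWithUserBuffers(zdl::SNPE::SNPE& snpe,
                                   zdl::DlSystem::UserBufferMap& inputMap,
                                   zdl::DlSystem::UserBufferMap& outputMap)
{
    return snpe.execute(inputMap, outputMap);
}

// ITensor mode: SNPE populates an output TensorMap, whose tensors can then
// be iterated element by element.
static void executeWithITensor(zdl::SNPE::SNPE& snpe,
                               zdl::DlSystem::ITensor* inputTensor)
{
    zdl::DlSystem::TensorMap outputTensorMap;
    if (!snpe.execute(inputTensor, outputTensorMap)) return;

    zdl::DlSystem::StringList tensorNames = outputTensorMap.getTensorNames();
    std::for_each(tensorNames.begin(), tensorNames.end(), [&](const char* name) {
        zdl::DlSystem::ITensor* tensor = outputTensorMap.getTensor(name);
        for (auto it = tensor->cbegin(); it != tensor->cend(); ++it) {
            std::cout << *it << "\n";  // one output value per line
        }
    });
}
```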
The following snippet of code shows how to specify the data type for a buffer using the native API.
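A sketch of the encoding choice, assuming the UserBufferEncoding classes from the SNPE headers; the chosen encoding object is passed to IUserBufferFactory::createUserBuffer(). The Tf8 constructor arguments shown (zero-point offset and quantization step size) are placeholder values for illustration:

```cpp
#include "DlSystem/IUserBuffer.hpp"

// 32-bit float elements (4 bytes per element).
zdl::DlSystem::UserBufferEncodingFloat floatEncoding;

// 8-bit quantized (TF-style) elements, parameterized by the value that
// represents zero and the quantization step size.
zdl::DlSystem::UserBufferEncodingTf8 tf8Encoding(0, 1.0f);
```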
Start by going to the snpe-sample base directory.
cd $SNPE_ROOT/examples/NativeCpp/SampleCode
Note the different makefiles associated with the different Linux platforms. $CXX needs to be set according to the target platform. Here is a table of the supported targets with their corresponding $CXX settings and the Makefiles to use.
Target | Makefile | Possible CXX value | Output Location |
---|---|---|---|
arm-oe-linux (gcc 6.4hf) | Makefile.arm-oe-linux-gcc6.4hf | arm-oe-linux-g++ | arm-oe-linux-gcc6.4hf |
aarch64-oe-linux (gcc 6.4) | Makefile.aarch64-oe-linux-gcc6.4 | aarch64-oe-linux-g++ | aarch64-oe-linux-gcc6.4 |
x86_64-linux | Makefile.x86_64-linux-clang | g++ | x86_64-linux-clang |
export CXX=<Name of c++ cross compiler>
make -f <Makefile for the target>
Note: Ensure that the path to the compiler binary is already set in $PATH.
Along with the sample executable, all required libraries need to be pushed onto their respective targets. $LD_LIBRARY_PATH may also need to be updated to point to the supporting libraries. You can run the executable with the -h argument to see its description.
snpe-sample -h
The description should look like the following:
DESCRIPTION:
------------
Example application demonstrating how to load and execute a neural network
using the SNPE C++ API.

REQUIRED ARGUMENTS:
-------------------
  -d  <FILE>      Path to the DL container containing the network.
  -i  <FILE>      Path to a file listing the inputs for the network.
  -o  <PATH>      Path to directory to store output results.

OPTIONAL ARGUMENTS:
-------------------
  -b  <TYPE>      Type of buffers to use [USERBUFFER, ITENSOR] (default is USERBUFFER).
  -u  <VAL,VAL>   Path to UDO package with registration library for UDOs.
                  Optionally, user can provide multiple packages as a comma-separated list.
Running snpe-sample assumes that one of the examples, Running the AlexNet Model or Running the Inception v3 Model, has previously been set up.
Run snpe-sample with the AlexNet model:
cd $SNPE_ROOT/models/alexnet/data
$SNPE_ROOT/examples/NativeCpp/SampleCode/obj/local/x86_64-linux-clang/snpe-sample -b ITENSOR -d ../dlc/bvlc_alexnet.dlc -i target_raw_list.txt -o output
The results are stored in the output directory. To process the output, run the following script to generate the classification results.
python $SNPE_ROOT/models/alexnet/scripts/show_alexnet_classifications.py -i target_raw_list.txt -o output/ -l ilsvrc_2012_labels.txt

Classification results
cropped/trash_bin.raw     0.949348 412 ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin
cropped/chairs.raw        0.365685 831 studio couch, day bed
cropped/plastic_cup.raw   0.749103 647 measuring cup
cropped/notice_sign.raw   0.722709 458 brass, memorial tablet, plaque
cropped/handicap_sign.raw 0.188248 919 street sign
Prerequisite: You will need the Android NDK to build the Android C++ executable. The tutorial assumes that you can invoke 'ndk-build' from the shell.
First move to snpe-sample's base directory.
cd $SNPE_ROOT/examples/NativeCpp/SampleCode
To build snpe-sample with clang/libc++ SNPE binaries (i.e., arm-android-clang6.0 and aarch64-android-clang6.0), use the following command:
cd $SNPE_ROOT/examples/NativeCpp/SampleCode
ndk-build NDK_TOOLCHAIN_VERSION=clang APP_STL=c++_shared
The ndk-build command will build both armeabi-v7a and arm64-v8a binaries of snpe-sample.
To run the Android C++ executable, push the appropriate SNPE libraries and the executable onto the Android target.
export SNPE_TARGET_ARCH=arm-android-clang6.0
export SNPE_TARGET_ARCH_OBJ_DIR=armeabi-v7a
adb shell "mkdir -p /data/local/tmp/snpeexample/$SNPE_TARGET_ARCH/bin"
adb shell "mkdir -p /data/local/tmp/snpeexample/$SNPE_TARGET_ARCH/lib"
adb shell "mkdir -p /data/local/tmp/snpeexample/dsp/lib"
adb push $SNPE_ROOT/lib/$SNPE_TARGET_ARCH/ /data/local/tmp/snpeexample/$SNPE_TARGET_ARCH/lib
adb push $SNPE_ROOT/lib/dsp/ /data/local/tmp/snpeexample/dsp/lib
adb push $SNPE_ROOT/examples/NativeCpp/SampleCode/obj/local/$SNPE_TARGET_ARCH_OBJ_DIR/snpe-sample /data/local/tmp/snpeexample/$SNPE_TARGET_ARCH/bin
Run snpe-sample with the AlexNet model on the target. This assumes that you have completed the setup steps in Run on Android Target to push all the sample data files and the AlexNet model to the target.
adb shell
export SNPE_TARGET_ARCH=arm-android-clang6.0
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/data/local/tmp/snpeexample/$SNPE_TARGET_ARCH/lib
export PATH=$PATH:/data/local/tmp/snpeexample/$SNPE_TARGET_ARCH/bin
cd /data/local/tmp/alexnet
snpe-sample -b ITENSOR -d bvlc_alexnet.dlc -i target_raw_list.txt -o output_sample
exit
Pull the target output into a host side output directory.
cd $SNPE_ROOT/models/alexnet/
adb pull /data/local/tmp/alexnet/output_sample output_sample
Again, we can run the interpretation script to see the classification results.
python $SNPE_ROOT/models/alexnet/scripts/show_alexnet_classifications.py -i data/target_raw_list.txt -o output_sample/ -l data/ilsvrc_2012_labels.txt

Classification results
cropped/trash_bin.raw     0.949348 412 ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin
cropped/chairs.raw        0.365685 831 studio couch, day bed
cropped/plastic_cup.raw   0.749103 647 measuring cup
cropped/notice_sign.raw   0.722709 458 brass, memorial tablet, plaque
cropped/handicap_sign.raw 0.188248 919 street sign
Similar example results can also be obtained using the Inception v3 model from Running the Inception v3 Model.