Image classification on QCS610 development kit

This project implements an image scene classification application on the Thundercomm TurboX C610 development board.

The main objective of this project is to capture the scene seen by the camera on the QCS610 device and classify it into one of several scene categories using a pretrained AI model.

Equipment
  • Thundercomm TurboX C610 development board: https://www.thundercomm.com/app_en/product/1593776185472315?index=1&categoryId=categorynull
  • USB cables: used for the serial console interface, ADB, and fastboot commands; the USB 3.0 Type-C port connects to the board for flashing images.

Documentation
  • Thundercomm TurboX C610 Platform Linux User Guide: Technical Documents section at https://www.thundercomm.com/app_en/product/1593776185472315?index=1&categoryId=categorynull

About the project
In this application, the goal is to build and deploy an image scene classification application on the QCS610 platform. The model has been trained to classify ten different indoor/outdoor activities; details of the classes are given in the labelmap.txt file.

Prerequisites

  • Ubuntu 18.04 or later on the host system
  • ADB (Android Debug Bridge) installed
  • TensorFlow 1.13 or higher installed
  • The latest image flashed on the TurboX C610 board
  • The Application SDK set up on the host system as described in the Thundercomm document

GStreamer plugins:
GStreamer, a framework for creating streaming media applications, has been used to develop this project. Qualcomm Technologies, Inc. (QTI) provides its own set of GStreamer plugins, which allow you to load and execute AI models as well as capture and stream video. The plugins currently support inference with the Qualcomm® Neural Processing SDK for AI (previously known as SNPE) and with TFLite models. In this project the qtitflite plugin is used to run inference with the application model; it also provides various pre- and post-processing support. A minimal pipeline sketch is given after the reference link below.

For additional references visit the following link:
https://developer.qualcomm.com/qualcomm-robotics-rb5-kit/software-reference-manual/application-semantics/gstreamer-plugins
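
To make the plugin's role concrete, below is a minimal sketch of how such a capture-inference-overlay-record pipeline could be assembled in C with gst_parse_launch(). It is only an illustration under stated assumptions, not the project's image_classification.c: the qtiqmmfsrc camera source, the qtitflite property names (model, labels, delegate), and the encoder/muxer elements are assumptions based on the QTI plugin documentation linked above and may differ on your SDK release.

    /* pipeline_sketch.c: illustrative only, not the project's image_classification.c.
     * Assumption: the QTI plugins (qtiqmmfsrc, qtitflite, qtioverlay) are installed
     * on the target, and qtitflite exposes model/labels/delegate properties; check
     * the QTI GStreamer plugin documentation for the exact names on your release. */
    #include <gst/gst.h>

    int main (int argc, char *argv[])
    {
      gst_init (&argc, &argv);

      /* Camera -> TFLite inference -> result overlay -> H.264 encode -> MP4 file.
       * The encoder and parser elements are placeholders for whatever the board's
       * GStreamer build provides. */
      const gchar *desc =
          "qtiqmmfsrc ! video/x-raw,width=1280,height=720 ! "
          "qtitflite delegate=dsp model=/data/c610/model.tflite "
          "labels=/data/c610/labelmap.txt ! "
          "qtioverlay ! omxh264enc ! h264parse ! mp4mux ! "
          "filesink location=/data/c610/video.mp4";

      GError *error = NULL;
      GstElement *pipeline = gst_parse_launch (desc, &error);
      if (pipeline == NULL) {
        g_printerr ("Failed to build pipeline: %s\n",
            error != NULL ? error->message : "unknown error");
        g_clear_error (&error);
        return 1;
      }

      gst_element_set_state (pipeline, GST_STATE_PLAYING);

      /* Block until an error or end-of-stream message is posted on the bus. */
      GstBus *bus = gst_element_get_bus (pipeline);
      GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
          GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
      if (msg != NULL)
        gst_message_unref (msg);
      gst_object_unref (bus);

      gst_element_set_state (pipeline, GST_STATE_NULL);
      gst_object_unref (pipeline);
      return 0;
    }

If the gst-launch-1.0 tool is present on the target image, the same pipeline string can be prototyped on the command line before being compiled into an application.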

Install the application SDK
To install the Application SDK:

  1. Download the Application SDK:
    https://thundercomm.s3.ap-northeast1.amazonaws.com/shop/doc/1593776185472315/TurboxC610_Application-SDK_v1.0.tar.gz
  2. Unpack the SDK using the command below:
    tar -xzvf Turbox-C610_Application-SDK_v1.0.tar.gz
  3. Execute the script file below. It will prompt for the default target directory; press
    Enter and then type Y to confirm.
    ./oecore-x86_64-armv7ahf-neon-toolchain-nodistro.0.sh
    This completes the environment setup; an optional toolchain sanity check follows below.
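
As an optional sanity check (not part of the official Thundercomm steps), the cross toolchain can be verified by sourcing the environment file installed above and cross-compiling a trivial program; the environment script exports $CC for the ARM target, the same variable used in the build step later on.

    /* hello.c: trivial program used only to verify the cross toolchain. */
    #include <stdio.h>

    int main (void)
    {
      printf ("TurboX C610 Application SDK toolchain is working\n");
      return 0;
    }

Build and inspect it on the host:

    $ source /usr/local/oecore-x86_64/environment-setup-armv7ahf-neon-oe-linux-gnueabi
    $ $CC hello.c -o hello
    $ file hello

If file reports a 32-bit ARM executable, the SDK environment is set up correctly.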

Model Conversion:
To deploy the model onto the target board, the TensorFlow/Keras model must be converted into TFLite or DLC (SNPE) format.

  1. Convert the Keras model into a TFLite model by running the command below:
    tflite_convert --keras_model_file=model.h5 --output_file=model.tflite

Steps to build and run the Application:

  1. To run the TFLite classification model on the TurboX C610 target board, follow the steps below.
    Step-1 : Run the command below to set up the build environment on the host system.
    $ source /usr/local/oecore-x86_64/environment-setup-armv7ahf-neon-oe-linux-gnueabi
    Step-2 : Build the binary from the GStreamer image classification source code by running the command below on the host system.
    $ $CC image_classification.c classification.h -o classific `pkg-config --cflags --libs gstreamer-1.0`
    Step-3 : Create a directory named c610 in the /data directory of the TurboX C610 target board.
    $ adb root
    $ adb remount
    $ adb shell mount -o remount,rw /
    $ adb shell
    # mkdir /data/c610
    # exit

    Step-4 : Push the AI model, label file, and binary to the board from the host system.
    $ adb push model.tflite /data/c610
    $ adb push labelmap.txt /data/c610
    $ adb push classific /bin/

    Step-5 : Execute the binary in the adb shell environment. The binary takes three command-line arguments: the first is the runtime, either ‘default’ (CPU) or ‘dsp’; the second and third are the TFLite model and the label file.
    $ adb shell
    # chmod +x /bin/classific
    # classific dsp /data/c610/model.tflite /data/c610/labelmap.txt

    After executing the above command, the application starts running inference with the TFLite model on the target board; the output is stored at /data/c610/video.mp4.
    The QCS610 captures video from the camera feed, and the frames are fed into the classification TFLite model. The model's output metadata is read by the qtioverlay plugin and displayed on the output video stream. To stop the video capture, press Ctrl + C (see the sketch after these steps for how a clean stop finalizes the MP4 file).

    Step-6 : Retrieve the output video.
    Use the adb pull command to get the output video with the classification label overlaid on it:
    $ adb pull /data/c610/video.mp4
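
Note that the recording is written through an MP4 muxer, so the file is only playable after the pipeline receives an end-of-stream and the muxer finalizes its headers; killing the process abruptly can leave video.mp4 unreadable. The sketch below shows the generic GStreamer pattern that the Ctrl + C handling in Step-5 presumably relies on (it is not an excerpt from image_classification.c): a SIGINT handler pushes an EOS event into the pipeline, and the program waits for the EOS message before shutting down. It uses a videotestsrc pipeline so it can be tried on any host with GStreamer installed.

    /* eos_on_sigint.c: generic pattern for finalizing an MP4 recording on Ctrl+C.
     * Illustrative only; the actual handling in image_classification.c may differ.
     * Requires an H.264 encoder plugin (x264enc here) on the system running it. */
    #include <gst/gst.h>
    #include <signal.h>

    static GstElement *pipeline = NULL;

    static void handle_sigint (int signum)
    {
      (void) signum;
      /* Ask the pipeline to drain; mp4mux writes its headers when EOS arrives.
       * A production application would use g_unix_signal_add() with a GMainLoop
       * instead of calling GStreamer from a raw signal handler. */
      if (pipeline != NULL)
        gst_element_send_event (pipeline, gst_event_new_eos ());
    }

    int main (int argc, char *argv[])
    {
      gst_init (&argc, &argv);

      pipeline = gst_parse_launch (
          "videotestsrc is-live=true ! videoconvert ! x264enc ! h264parse ! "
          "mp4mux ! filesink location=out.mp4", NULL);
      if (pipeline == NULL)
        return 1;

      signal (SIGINT, handle_sigint);
      gst_element_set_state (pipeline, GST_STATE_PLAYING);

      /* Block until EOS (triggered by Ctrl+C) or an error is posted on the bus. */
      GstBus *bus = gst_element_get_bus (pipeline);
      GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
          GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
      if (msg != NULL)
        gst_message_unref (msg);
      gst_object_unref (bus);

      /* Only after EOS has been processed is it safe to tear the pipeline down
       * and expect a playable out.mp4. */
      gst_element_set_state (pipeline, GST_STATE_NULL);
      gst_object_unref (pipeline);
      return 0;
    }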

Note :

  1. SNPE (DLC) models will not work with GStreamer because the qtimlesnpe element is not available in this GStreamer build.
  2. With TFLite inference in dsp mode, the classification label is displayed on the video output, whereas with the ‘default’ (CPU) runtime the label information does not appear on the video output.

Qualcomm QCS610 and Qualcomm Neural Processing SDK are products of Qualcomm Technologies, Inc. and/or its subsidiaries.

Name, Title, Company
  • Rakesh Sankar ([email protected]), Sr. System Architect, Global Edge Software Ltd
  • Ashish Tiwari ([email protected]), Architect, Global Edge Software Ltd
  • Ramsingh G ([email protected]), Senior Software Engineer, Global Edge Software Ltd
  • Keerthi M K ([email protected]), Software Engineer, Global Edge Software Ltd