Forums - Long DSP runtime Initialization Time

fangwen_tu
Join Date: 14 Sep 20
Posts: 1
Posted: Sun, 2022-01-16 01:09

Hi,

I tested a PyTorch UNet model containing only 2D convolution, down/up-sampling, and ReLU operators. I converted it to DLC with snpe-pytorch-to-dlc and quantized it with snpe-dlc-quantize. When I deployed the model on the Snapdragon 888 DSP with snpe-net-run, the first trial with the original parameters showed a fast initialization time (Accelerator Init Time). But when I reduced the network by half (halving the channel count of each layer), the initialization time increased about 5x. I cannot figure out why, since apart from the model width, all the other operators remain the same.
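For reference, the pipeline described above looks roughly like the following. This is only a sketch: the file names, the input tensor name, and the input shape (1,3,256,256) are placeholders I assumed for illustration, not values from the original post.

```shell
# Convert the traced PyTorch model to DLC
# (input name and shape are assumed placeholders).
snpe-pytorch-to-dlc --input_network unet.pt \
    --input_dim input "1,3,256,256" \
    --output_path unet.dlc

# Quantize the DLC using a list of representative raw input files.
snpe-dlc-quantize --input_dlc unet.dlc \
    --input_list input_list.txt \
    --output_dlc unet_quant.dlc

# Run on the DSP runtime; the statistics include Accelerator Init Time.
snpe-net-run --container unet_quant.dlc \
    --input_list input_list.txt \
    --use_dsp
```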

It is much appreciated if anyone can help to explain the situation. Thanks.

weihuan
Join Date: 12 Apr 20
Posts: 270
Posted: Sun, 2022-01-16 17:30

Dear customer,

What conversion commands did you use?

We recommend quantizing the model with the --enable_htp option when executing on the Snapdragon 888 DSP. The command is as below:

      snpe-dlc-quantize --input_dlc ${MODEL}.dlc \
            --input_list ${INPUT_LIST} \
            --output_dlc ${MODEL}_quant.dlc \
            --enable_htp \
            --act_bitwidth ${BW} \
            --override_params

 

BR.

Wei

