Forums - NNAPI Concat is slow

NNAPI Concat is slow
kyakuno
Join Date: 1 Aug 19
Posts: 3
Posted: Tue, 2023-04-25 23:30
We are developing an inference runtime that uses Android's NNAPI. By installing the ailia AI showcase published below, YOLOX inference via NNAPI is possible.
 
On Snapdragon 8+ it works fine: yolox_tiny inference takes about 6 ms. However, on Snapdragon 888, yolox_tiny inference takes nearly 4000 ms.
 
However, if we insert Transpose nodes around the Concat (Transpose -> Concat -> Transpose), inference time drops to 794 ms.
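The post does not show the exact graph, but the rewrite it describes can be sketched in numpy. The sketch below uses hypothetical NCHW tensor shapes (not given in the post) and only demonstrates that transposing so the concat axis becomes the innermost dimension, concatenating, and transposing back is numerically equivalent to a direct Concat; whether this is faster depends entirely on the NNAPI driver.

```python
import numpy as np

# Hypothetical NCHW feature maps; shapes are illustrative only.
a = np.arange(1 * 64 * 13 * 13, dtype=np.float32).reshape(1, 64, 13, 13)
b = -a

# Direct concat along the channel axis (axis=1 in NCHW).
direct = np.concatenate([a, b], axis=1)

# Workaround: transpose to NHWC so the concat axis is the last
# (innermost) dimension, concatenate there, then transpose back.
a_t = a.transpose(0, 2, 3, 1)            # NCHW -> NHWC
b_t = b.transpose(0, 2, 3, 1)
fused = np.concatenate([a_t, b_t], axis=3)
rewritten = fused.transpose(0, 3, 1, 2)  # NHWC -> NCHW

# Both graphs produce the same tensor.
assert np.array_equal(direct, rewritten)
```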
 
Concat is very slow on the Snapdragon 888. The Snapdragon 855 shows the same trend.
 
Is there a way to speed up Concat?
weihuan
Join Date: 12 Apr 20
Posts: 270
Posted: Sat, 2023-04-29 08:44

Dear developer,

We do not fully understand the issue you mentioned.

Concat and Transpose are both supported on SNPE.

kyakuno
Join Date: 1 Aug 19
Posts: 3
Posted: Sun, 2023-04-30 22:19

Thank you for your comment. We are using NNAPI directly, not SNPE.

