Forums - ReLU support

ReLU support
simonr_rob
Join Date: 9 Aug 12
Posts: 7
Posted: Thu, 2017-10-19 05:19

Hi,

I have a working Caffe model which I have converted with snpe-caffe-to-dlc with no errors; however, I am not getting the expected output from the network when I test it, either on Android or using snpe-network-run.

After ruling out the input image format, I finally isolated the problem to the ReLU layer not supporting the negative_slope parameter used for a leaky ReLU (as defined below):

layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
  relu_param{
    negative_slope: 0.1
  }
}
 
The documentation shows that ReLU is supported and links to the relu_layer.cpp code, which supports this parameter. Is SNPE supposed to support this parameter?
 
Cheers,
Simon
simonr_rob
Join Date: 9 Aug 12
Posts: 7
Posted: Thu, 2017-10-19 06:46

I think I have a workaround using PReLU instead:

layer {
  name: "PReLU1"
  type: "PReLU"
  bottom: "conv1"
  top: "conv1"
  prelu_param {
    filler { type: "constant" value: 0.1 }
    channel_shared: false
  }
}
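To see why this workaround behaves the same as leaky ReLU: PReLU computes x for positive inputs and alpha * x for negative ones, so a PReLU whose alpha is fixed at 0.1 (via the constant filler above) is numerically identical to a ReLU with negative_slope: 0.1. A minimal sketch:

```python
def leaky_relu(x, negative_slope=0.1):
    # Caffe's ReLU with negative_slope: pass positives through,
    # scale negatives by the slope.
    return x if x > 0 else negative_slope * x

def prelu(x, alpha):
    # PReLU: same formula, but alpha is normally a learned weight.
    # Fixing alpha with a constant filler makes it a plain leaky ReLU.
    return x if x > 0 else alpha * x

# With alpha fixed at 0.1, the two functions agree everywhere:
for v in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    assert prelu(v, 0.1) == leaky_relu(v, 0.1)
```

Note that the constant filler initializes every per-channel alpha to 0.1; as long as the network is only used for inference (so the alphas are never trained away from 0.1), the substitution is exact.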
 
I hope it helps anybody else struggling with this!
 
Cheers,
 
Simon
 
 
jesliger
Join Date: 6 Aug 13
Posts: 75
Posted: Wed, 2017-11-01 10:44

Hi Simon.  Glad you found a workaround!  And thanks for posting it for others to use.

SNPE doesn't support leaky ReLU at this time.

zl1994
Join Date: 12 Aug 17
Posts: 8
Posted: Wed, 2017-11-15 17:01

Hi jesliger, so is there a way to get leaky ReLU support?

jesliger
Join Date: 6 Aug 13
Posts: 75
Posted: Wed, 2017-11-15 18:18

Does the PReLU workaround that was posted not work for you?  Leaky Relu is not in SNPE at this time.  I'm hoping you can use PReLU for now.

zl1994
Join Date: 12 Aug 17
Posts: 8
Posted: Thu, 2017-11-16 00:33

I wonder how to fix the weights of PReLU in TensorFlow so that it executes like leaky ReLU.
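For reference, one possible approach in TensorFlow is to give the PReLU layer a constant alpha and freeze it so training never changes it. This is only a sketch (assuming tf.keras, and not tested against the SNPE converter):

```python
import tensorflow as tf

# Sketch: a PReLU layer whose per-channel alphas are all initialized
# to 0.1 and then frozen, so at inference it computes exactly
# leaky ReLU with negative_slope 0.1.
prelu = tf.keras.layers.PReLU(
    alpha_initializer=tf.keras.initializers.Constant(0.1),
    shared_axes=[1, 2],  # share alpha over H and W; one alpha per channel
)
prelu.trainable = False  # keep the alphas fixed at 0.1 during training
```

Whether the converter then maps this layer cleanly to SNPE's PReLU is something you would need to verify on your own model.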

kirov
Join Date: 28 Dec 17
Posts: 5
Posted: Wed, 2018-01-31 06:43

Have you tested this on the DSP runtime? I observe huge accuracy errors on the DSP (compared to the other two runtimes) when using PReLU in lieu of leaky ReLU, even when I reduce it to a basic ReLU by setting alpha to 0.

