Forums - convert with batchNormalization

1225541074
Join Date: 21 Mar 18
Posts: 3
Posted: Tue, 2018-06-12 03:37

I trained a model with 'slim.batch_norm'. When I converted the .pb file to DLC, the following error occurred. Does anyone know why?

2018-06-12 18:12:27,304 - 126 - ERROR - Encountered Error: operands could not be broadcast together with shapes (0,) (64,)
Traceback (most recent call last):
  File "/home/user/Android/snpe-1.15.0/bin/x86_64-linux-clang/snpe-tensorflow-to-dlc", line 120, in main
    converter.convert(args.dlc, args.model_version, converter_command)
  File "/home/user/Android/snpe-1.15.0/lib/python/converters/tensorflow/converter.py", line 304, in convert
    self._convert_layers()
  File "/home/user/Android/snpe-1.15.0/lib/python/converters/tensorflow/converter.py", line 340, in _convert_layers
    descriptors = self._resolve_descriptors_from_nodes(graph_ops)
  File "/home/user/Android/snpe-1.15.0/lib/python/converters/tensorflow/converter.py", line 439, in _resolve_descriptors_from_nodes
    resolved_descriptors = resolver.resolve_layer(graph_matcher, self._graph_helper)
  File "/home/user/Android/snpe-1.15.0/lib/python/converters/tensorflow/layers/batchnorm.py", line 312, in resolve_layer
    beta=beta))
  File "/home/user/Android/snpe-1.15.0/lib/python/converters/tensorflow/layers/batchnorm.py", line 55, in __init__
    scaled_stddev = stddev * scale
ValueError: operands could not be broadcast together with shapes (0,) (64,)
 

Danziv
Join Date: 6 Aug 18
Posts: 2
Posted: Thu, 2018-08-09 06:17

Hi, I had the exact same issue with my own model; make sure you supply training=False to all your batch norm layers. Sadly, I'm now facing a different issue where the conversion takes forever, exhausts all my PC resources, and eventually fails.
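
A minimal sketch of the kind of fix described above, assuming a TF 1.x slim model (the input shape, conv layer, and names here are illustrative, not taken from the original model). Note that for slim.batch_norm the flag is spelled is_training, while tf.layers.batch_normalization uses training:

import tensorflow as tf
import tensorflow.contrib.slim as slim

# Illustrative placeholder graph; the real model's shapes and layers will differ.
inputs = tf.placeholder(tf.float32, [1, 224, 224, 3], name='input')
net = slim.conv2d(inputs, 64, [3, 3], activation_fn=None)
# Build batch norm in its inference form so the exported .pb contains
# constant moving mean/variance that the SNPE converter can match and fold.
net = slim.batch_norm(net, is_training=False)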

shenyingying25
Join Date: 14 Nov 18
Posts: 19
Posted: Mon, 2019-04-15 01:58

Hi guys,

Why should I set training=False when I use the batch norm layers? And have you solved the problem in the end? Looking forward to your reply, thanks a lot.

gesqdn-forum
Join Date: 4 Nov 18
Posts: 184
Posted: Fri, 2019-04-26 03:03

Hi,
The batch normalization layer (slim.batch_norm) you are using is not supported by SNPE.
Can you try tf.layers.batch_normalization, which is an SNPE-supported layer?
For more details on the usage of the tf.layers.batch_normalization layer, click here.

For details on supported layers, kindly check SNPE Supported layer details.
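
A minimal sketch of the suggested replacement, assuming TF 1.x (the input shape and conv layer are illustrative placeholders, not from the original model). As in the earlier reply, pass training=False when building the export graph so the converter sees the inference-time pattern:

import tensorflow as tf

# Illustrative placeholder graph; the real model's shapes and layers will differ.
inputs = tf.placeholder(tf.float32, [1, 224, 224, 3], name='input')
net = tf.layers.conv2d(inputs, 64, 3, padding='same', use_bias=False)
# training=False uses the moving mean/variance, producing the inference-time
# pattern that snpe-tensorflow-to-dlc expects to match.
net = tf.layers.batch_normalization(net, training=False)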
