Forums - dlc convert

dlc convert
blackxin55
Join Date: 25 Jul 18
Posts: 1
Posted: Fri, 2018-07-27 04:37

Does Qualcomm SNPE support ResNet-50? Converting ResNet-50 to DLC failed.

./bin/x86_64-linux-clang/snpe-tensorflow-to-dlc  --graph /home/tensorflow/sigma/tensorflow-master/resnet_v2_101.pb   --input_dim input "1,224,224,3" --out_node "resnet_v2_101/predictions/Reshape_1" --dlc mobile.dlc    --allow_unconsumed_nodes
Where is this wrong? Doesn't Qualcomm SNPE support ResNet-50?
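As a first sanity check (a minimal sketch, not part of the original command; it assumes TensorFlow 1.x and the same .pb path used above), the graph can be loaded directly to confirm that the input placeholder and the --out_node name actually exist in resnet_v2_101.pb:

import tensorflow as tf

# Load the frozen graph passed to snpe-tensorflow-to-dlc above.
graph_def = tf.GraphDef()
with tf.gfile.GFile('/home/tensorflow/sigma/tensorflow-master/resnet_v2_101.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name='')

# Print placeholders (candidates for --input_dim) and ops under "predictions"
# (candidates for --out_node), together with their static output shapes.
for op in graph.get_operations():
    if op.type == 'Placeholder' or op.name.startswith('resnet_v2_101/predictions'):
        print(op.name, op.type, [str(t.shape) for t in op.outputs])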
Enrico Ros
Join Date: 14 Jul 17
Posts: 10
Posted: Wed, 2018-08-01 11:23

Please also provide the error messages produced by the command above.

zf.africa
Join Date: 15 Jun 17
Posts: 51
Posted: Mon, 2018-08-20 03:10

Hi Enrico,

I have also tried a model trained with this code: https://github.com/mrharicot/monodepth

When converting the checkpoint or frozen model to a DLC model, the conversion tool just hangs in session.run:

snpe-1.17.0/lib/python/converters/tensorflow/util.py (around line 232):

Quote:

        if len(requiring_evaluation) > 0:
            try:
                outputs = self._session.run(fetches=requiring_evaluation, feed_dict=input_tensors)
                outputs = dict(zip(requiring_evaluation, outputs))
                for t, o in outputs.iteritems():
                    self._tensor_value_cache[t.name] = o
                outputs_map.update(outputs)
                requiring_evaluation = []
            except InvalidArgumentError:
                pass

So why would it hang in session.run?

My conversion command line is:

Quote:

./snpe-tensorflow-to-dlc --graph /home/damon/work/monodepth/my_model_resnet/frozen_model.pb -i split 1,256,512,3 --out_node model/decoder/mul_21 --dlc ~/work/monodepth/my_model_resnet/frozen_model.dlc
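For reference, a minimal isolation test (a sketch; it assumes the "split:0" input and "model/decoder/mul_21:0" output names from the command above and feeds random data) is to evaluate the same output tensor in a plain TensorFlow session. If session.run also hangs here, the problem is in the graph or TensorFlow itself rather than in the converter:

import numpy as np
import tensorflow as tf

# Load the same frozen graph that is passed to snpe-tensorflow-to-dlc.
graph_def = tf.GraphDef()
with tf.gfile.GFile('/home/damon/work/monodepth/my_model_resnet/frozen_model.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name='')
    # Tensor names assumed from the -i / --out_node arguments above.
    in_tensor = graph.get_tensor_by_name('split:0')
    out_tensor = graph.get_tensor_by_name('model/decoder/mul_21:0')
    with tf.Session() as sess:
        result = sess.run(out_tensor, feed_dict={in_tensor: np.random.rand(1, 256, 512, 3)})
        print(result.shape)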

Any comments would be appreciated! Thanks.

manasa
Join Date: 8 May 18
Posts: 7
Posted: Mon, 2018-10-08 18:56

I'm trying to do the same, converting a TensorFlow ResNet-50_v1 model to DLC, and facing errors. The model itself is not getting loaded. Can someone suggest how to fix this?

I'm using SNPE 1.19.2 and TensorFlow 1.6 (since the documentation says that SNPE is tested with this version of TensorFlow).

Upgrading the TensorFlow version to 1.11 didn't help either.

snpe-tensorflow-to-dlc --graph /home/test/Desktop/ResNet-50_v1/Frozen_graphs/resnet50_v1.pb --input_dim "input" 1,224,224,3 --out_node resnet_v1_50/SpatialSqueeze --dlc /home/test/snpe-sdk/models/tf_resnet50v1/ --allow_unconsumed_nodes --verbose

ERROR

2018-10-08 18:46:28.372973: I tensorflow/core/platform/cpu_feature_guard.cc:140] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2018-10-08 18:46:28,374 - 109 - ERROR - Encountered Error: Error parsing message
Traceback (most recent call last):
  File "/home/test/snpe-sdk/bin/x86_64-linux-clang/snpe-tensorflow-to-dlc", line 99, in main
    model = loader.load(args.graph, in_nodes, in_dims, args.in_type, args.out_node, session)
  File "/home/test/snpe-sdk/lib/python/converters/tensorflow/loader.py", line 50, in load
    graph_def = self.__import_graph(graph_pb_or_meta_path, session, out_node_names)
  File "/home/test/snpe-sdk/lib/python/converters/tensorflow/loader.py", line 102, in __import_graph
    graph_def = cls.__import_from_frozen_graph(graph_path)
  File "/home/test/snpe-sdk/lib/python/converters/tensorflow/loader.py", line 115, in __import_from_frozen_graph
    graph_def.ParseFromString(f.read())
DecodeError: Error parsing message
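A DecodeError from ParseFromString usually means the file being read is not a serialized frozen GraphDef (for example, a checkpoint or a SavedModel rather than a frozen .pb). A minimal sketch (assuming TensorFlow 1.x and the same path as in the command above) to check the file outside the converter:

import tensorflow as tf

# Try to parse the file the same way the SNPE loader does.
with tf.gfile.GFile('/home/test/Desktop/ResNet-50_v1/Frozen_graphs/resnet50_v1.pb', 'rb') as f:
    data = f.read()

graph_def = tf.GraphDef()
try:
    graph_def.ParseFromString(data)
    print('Parsed frozen GraphDef with %d nodes' % len(graph_def.node))
except Exception as e:
    print('Not a valid frozen GraphDef:', e)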

csiitism
Join Date: 31 Oct 19
Posts: 6
Posted: Thu, 2019-12-26 06:07

The question is how you are selecting --input_dim [shape is [?,?,?,3]] as the input layer, and why resnet_v2_101/predictions/Reshape_1 is the output layer [are you able to get boxes, labels, and scores from there]?
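One way to check this (a sketch, not from the original posts; it assumes the resnet_v2_101.pb graph and the node name quoted above) is to load the graph and print the static shape of that tensor. For a classification ResNet it should be [batch, num_classes], i.e. class probabilities, rather than detection boxes, labels, and scores:

import tensorflow as tf

# Path assumed from the first post in this thread.
graph_def = tf.GraphDef()
with tf.gfile.GFile('resnet_v2_101.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name='')

out = graph.get_tensor_by_name('resnet_v2_101/predictions/Reshape_1:0')
print(out.shape)  # expected [batch, num_classes] for a classifier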

Please confirm.

Thank you.
