Hello-
I found that when trying to convert ssd_mobilenet_v3_small_coco_2020_01_14 from TF's object detection model zoo, snpe-tensorflow-to-dlc enters an infinite loop and, after many hours of leaking memory, eventually crashes with an OOM. Below are the steps I used, for reproducibility. Any support you can provide on importing this model is highly appreciated, thanks!
Freezing TF MobileNetV3-SSD:
git clone git@github.com:tensorflow/models.git tensorflow_models
cd tensorflow_models/
git fetch --tags
git tag
git checkout v1.11
cd research/
pip3 install tensorflow==1.11
mkdir ssd_export
export INPUT_TYPE=image_tensor
export PIPELINE_CONFIG_PATH=/home/user01/org/snpe-1.40.0.2130/models/ssd_mobilenet_v3_small_coco_2020_01_14/pipeline.config
export TRAINED_CKPT_PREFIX=/home/user01/org/snpe-1.40.0.2130/models/ssd_mobilenet_v3_small_coco_2020_01_14/model.ckpt
export EXPORT_DIR=./ssd_export
pushd ~/org/tf-models/models/research/
python3 object_detection/export_inference_graph.py --input_type=${INPUT_TYPE} --pipeline_config_path=${PIPELINE_CONFIG_PATH} --trained_checkpoint_prefix=${TRAINED_CKPT_PREFIX} --output_directory=${EXPORT_DIR}
Converting the frozen TF GraphDef to SNPE DLC format (enters an infinite loop with a memory leak):
(venv3.5.9) user01@user01-desktop ~/org/snpe-1.40.0.2130 $ python ./bin/x86_64-linux-clang/snpe-tensorflow-to-dlc --input_network ../tf-models/research/ssd_export/frozen_inference_graph.pb --input_dim Preprocessor/sub 1,300,300,3 --out_node detection_classes --out_node detection_boxes --out_node detection_scores --output_path mobilenetv3_ssd.tf.dlc --allow_unconsumed_nodes
WARNING:tensorflow:From ./bin/x86_64-linux-clang/snpe-tensorflow-to-dlc:33: The name tf.Session is deprecated. Please use tf.compat.v1.Session instead.
2020-08-28 13:53:51,193 - 139 - WARNING - From ./bin/x86_64-linux-clang/snpe-tensorflow-to-dlc:33: The name tf.Session is deprecated. Please use tf.compat.v1.Session instead.
WARNING:tensorflow:From ./bin/x86_64-linux-clang/snpe-tensorflow-to-dlc:33: The name tf.ConfigProto is deprecated. Please use tf.compat.v1.ConfigProto instead.
2020-08-28 13:53:51,194 - 139 - WARNING - From ./bin/x86_64-linux-clang/snpe-tensorflow-to-dlc:33: The name tf.ConfigProto is deprecated. Please use tf.compat.v1.ConfigProto instead.
2020-08-28 13:53:51.218351: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 3392345000 Hz
2020-08-28 13:53:51.219134: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x55c9ce74aa60 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2020-08-28 13:53:51.219169: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
WARNING:tensorflow:From /home/user01/org/snpe-1.40.0.2130/lib/python/qti/aisw/converters/tensorflow/loader.py:146: The name tf.GraphDef is deprecated. Please use tf.compat.v1.GraphDef instead.
2020-08-28 13:53:51,220 - 139 - WARNING - From /home/user01/org/snpe-1.40.0.2130/lib/python/qti/aisw/converters/tensorflow/loader.py:146: The name tf.GraphDef is deprecated. Please use tf.compat.v1.GraphDef instead.
2020-08-28 13:53:53.987529: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at function_ops.cc:69 : Internal: Retval[5524] has already been set.
2020-08-28 13:53:53.987607: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at function_ops.cc:69 : Internal: Retval[5525] has already been set.
2020-08-28 13:53:53.987650: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at function_ops.cc:69 : Internal: Retval[5522] has already been set.
2020-08-28 13:53:53.987669: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at function_ops.cc:69 : Internal: Retval[5523] has already been set.
2020-08-28 13:53:53.987695: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at function_ops.cc:69 : Internal: Retval[5517] has already been set.
2020-08-28 13:53:53.987759: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at function_ops.cc:69 : Internal: Retval[5515] has already been set.
2020-08-28 13:53:53.987782: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at function_ops.cc:69 : Internal: Retval[5536] has already been set.
2020-08-28 13:53:53.987816: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at function_ops.cc:69 : Internal: Retval[5516] has already been set.
2020-08-28 13:53:53.987845: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at function_ops.cc:69 : Internal: Retval[5518] has already been set.
2020-08-28 13:53:53.987869: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at function_ops.cc:69 : Internal: Retval[5520] has already been set.
2020-08-28 13:53:53.987886: W tensorflow/core/framework/op_kernel.cc:1651] OP_REQUIRES failed at function_ops.cc:69 : Internal: Retval[5521] has already been set.
2020-08-28 13:53:54,088 - 403 - WARNING - ERROR_TF_FALLBACK_TO_ONDEMAND_EVALUATION: Unable to resolve operation output shapes in single pass. Using on-demand evaluation!
2020-08-28 13:53:54,091 - 171 - INFO - INFO_ALL_BUILDING_NETWORK:
==============================================================
Building Network
==============================================================
2020-08-28 23:56:40,778 - 166 - ERROR - Encountered Error:
Traceback (most recent call last):
  File "./bin/x86_64-linux-clang/snpe-tensorflow-to-dlc", line 37, in main
    ir_graph = converter.convert()
  File "/home/user01/org/snpe-1.40.0.2130/lib/python/qti/aisw/converters/tensorflow/tf_to_ir.py", line 317, in convert
    self._convert_layers()
  File "/home/user01/org/snpe-1.40.0.2130/lib/python/qti/aisw/converters/tensorflow/tf_to_ir.py", line 352, in _convert_layers
    descriptors = self._resolve_descriptors_from_nodes(graph_ops)
  File "/home/user01/org/snpe-1.40.0.2130/lib/python/qti/aisw/converters/tensorflow/tf_to_ir.py", line 491, in _resolve_descriptors_from_nodes
    resolved_descriptors = resolver.resolve_layer(graph_matcher, self._graph_helper)
  File "/home/user01/org/snpe-1.40.0.2130/lib/python/qti/aisw/converters/tensorflow/layers/reshape.py", line 103, in resolve_layer
    _, _, consumed_nodes = graph_helper.get_static_data_info(reshape_input)
  File "/home/user01/org/snpe-1.40.0.2130/lib/python/qti/aisw/converters/tensorflow/util.py", line 456, in get_static_data_info
    queue.extend(head.op.inputs)
MemoryError
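For what it's worth, the traceback dies inside get_static_data_info, which walks a tensor's producing ops back through head.op.inputs. A plausible explanation (this is my speculation, not the actual SNPE code) is that a breadth-first walk like that, with no visited set, re-enqueues ops without bound when the graph has a cycle or many shared paths, e.g. the TF while-loops the SSD postprocessor creates. A minimal mock illustrating the difference (the Op class and both walk functions are hypothetical stand-ins):

```python
# Hypothetical mock of a converter-style graph walk. Without a visited
# set, ops reachable via a cycle are re-enqueued forever, so the queue
# grows until memory runs out -- matching the observed MemoryError.
from collections import deque


class Op:
    def __init__(self, name, inputs=()):
        self.name = name
        self.inputs = list(inputs)


def walk_naive(start, max_steps=10_000):
    """Unbounded walk: returns how many steps ran before hitting the cap."""
    queue = deque([start])
    steps = 0
    while queue and steps < max_steps:
        head = queue.popleft()
        queue.extend(head.inputs)   # nothing stops re-expanding a seen op
        steps += 1
    return steps


def walk_visited(start):
    """Cycle-safe walk: each op is expanded at most once."""
    queue, seen = deque([start]), set()
    steps = 0
    while queue:
        head = queue.popleft()
        if id(head) in seen:
            continue
        seen.add(id(head))
        queue.extend(head.inputs)
        steps += 1
    return steps


# Two-op cycle, like the Merge/NextIteration pair a TF while-loop creates.
a = Op("Merge")
b = Op("NextIteration", [a])
a.inputs = [b]

print(walk_naive(a))    # 10000: the naive walk only stops at the cap
print(walk_visited(a))  # 2: terminates once both ops have been seen
```

If this is roughly what is happening, the converter would spin on the postprocessing subgraph until the queue exhausts memory, which matches the ten-hour gap between the "Building Network" banner and the MemoryError in the log above.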
I'm facing the same error. Please let me know if this issue has been resolved.
I have encountered the same issue converting SSD/SSDLite+MobileNetV2 frozen inference graph to DLC.
TensorFlow version: 1.15
SNPE version: 1.42
The command:
snpe-tensorflow-to-dlc --input_network ssdlite_mobilenet_v2.pb --input_dim Preprocessor/sub 1,320,320,3 --out_node detection_boxes --output_path ssdlite_mobilenet_v2.dlc --allow_unconsumed_nodes
freezes after outputting
==============================================================
Building Network
==============================================================
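Until this is fixed, one practical mitigation is to run the conversion under a watchdog so a hung "Building Network" phase is killed after a fixed time budget instead of running overnight. A generic sketch using only the standard library (the sleep command below is a stand-in for the real snpe-tensorflow-to-dlc invocation):

```python
# Watchdog wrapper: run a command, kill it if it exceeds the time budget.
import subprocess
import sys


def run_with_timeout(cmd, timeout_s):
    """Run cmd; return its exit code, or None if it was killed on timeout."""
    try:
        # subprocess.run kills the child itself when the timeout expires,
        # then raises TimeoutExpired.
        return subprocess.run(cmd, timeout=timeout_s).returncode
    except subprocess.TimeoutExpired:
        return None


# Demo with a stand-in command that sleeps longer than the 1 s budget;
# replace it with the snpe-tensorflow-to-dlc command line and a budget of,
# say, an hour.
code = run_with_timeout([sys.executable, "-c", "import time; time.sleep(60)"], 1)
print(code)  # None: the process was killed after 1 s
```

This doesn't fix the conversion, but it turns a multi-hour hang (and the eventual OOM taking the machine down) into a fast, clean failure you can iterate on.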
SNPE 1.43.0.2307
TensorFlow 1.15
Python 3.5.2
Network: SSDLite+MobileNetV2 trained with TensorFlow 1.15.
Conversion:
snpe-tensorflow-to-dlc --input_network ${INPUT_MODEL} --input_dim Preprocessor/sub 1,320,320,3 --out_node detection_classes --out_node detection_boxes --out_node detection_scores --output_path ${OUTPUT_MODEL}.dlc --allow_unconsumed_nodes --show_unconsumed_nodes --debug
Output:
snpe-dlc-info reports the following for it: