Generating .dlc from TensorFlow checkpoint meta file

koumis
Join Date: 21 Sep 17
Posts: 18
Posted: Thu, 2018-02-15 22:02
I am attempting to generate a .dlc from the TensorFlow meta files generated by the following example script:
 
 
#!/usr/bin/env python
 
import os
import tensorflow as tf
 
class Network(object):
 
    def __init__(self, input_size, learning_rate=0.001):
 
        rows, cols, depth = input_size
 
        with tf.variable_scope('input_layer'):
            self.input_layer = tf.placeholder(tf.float32,
                shape=[None, rows, cols, depth],
                name='Network_Input')
 
        with tf.variable_scope('conv_1'):
            # First hidden layer
            conv_1 = tf.layers.conv2d(
                inputs=self.input_layer,
                filters=16,
                kernel_size=[8, 8],
                strides=(4, 4),
                padding='SAME',
                activation=tf.nn.relu
            )
 
        with tf.variable_scope('conv_2'):
            # Second hidden layer
            conv_2 = tf.layers.conv2d(
                inputs=conv_1,
                filters=32,
                kernel_size=[4, 4],
                strides=(2, 2),
                padding='SAME',
                activation=tf.nn.relu
            )
 
        with tf.variable_scope('conv_2_flat'):
            conv_2_flat = tf.contrib.layers.flatten(conv_2)
 
        with tf.variable_scope('dense_3'):
            # Third hidden layer
            dense_3 = tf.layers.dense(inputs=conv_2_flat, units=256, activation=tf.nn.relu)
 
        with tf.variable_scope('output_layer'):
            # Output layer
            self.output_layer = tf.layers.dense(inputs=dense_3, units=2, activation=None, name='Network_Output')
 
        # Do I need to specify shape?
        self.target = tf.placeholder(tf.float32, name='Network_Target')
 
        self.loss = tf.reduce_mean(tf.square(tf.subtract(self.output_layer, self.target)))
 
        self.optm = tf.train.RMSPropOptimizer(learning_rate=learning_rate).minimize(self.loss)
 
        init = tf.global_variables_initializer()
        self.saver = tf.train.Saver()
        self.sess = tf.Session(config=tf.ConfigProto(log_device_placement=True))
        self.sess.run(init)
 
    def save(self, path):
        save_path_model = self.saver.save(self.sess, os.path.join(path, 'model.ckpt'))
        print('Model saved in file: {}'.format(save_path_model))
        return path
 
 
network = Network((64, 64, 12))
network.save('/home/ubuntu/Desktop/tensorflow/export')
 
Running snpe-tensorflow-to-dlc on the generated meta file(s), I get the following warnings:
 
➜  tensorflow snpe-tensorflow-to-dlc --graph ./export/model.ckpt.meta -i "input_layer/Network_Input" 64,64,12 --out_node output_layer/Network_Output/MatMul
2018-02-15 10:45:13.231736: I tensorflow/core/platform/cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA
2018-02-15 10:45:13,413 - 309 - WARNING - WARNING_TF_SCOPE_OP_NOT_CONSUMED: Operation (conv_2_flat_layer/Flatten/flatten/Shape) not consumed by converter: Shape.
2018-02-15 10:45:13,413 - 309 - WARNING - WARNING_TF_SCOPE_OP_NOT_CONSUMED: Operation (conv_2_flat_layer/Flatten/flatten/strided_slice) not consumed by converter: StridedSlice.
2018-02-15 10:45:13,413 - 309 - WARNING - WARNING_TF_SCOPE_OP_NOT_CONSUMED: Operation (conv_2_flat_layer/Flatten/flatten/Reshape/shape) not consumed by converter: Pack.
2018-02-15 10:45:13,413 - 309 - WARNING - WARNING_TF_SCOPE_OP_NOT_CONSUMED: Operation (output_layer/Network_Output/MatMul) not consumed by converter: MatMul.
2018-02-15 10:45:13,413 - 123 - ERROR - Conversion failed: ERROR_TF_OPERATION_NOT_MAPPED_TO_LAYER: Some operations in the Tensorflow graph were not resolved to a layer. You can use --allow_unconsumed_nodes for partial graph resolution!
 
I do not understand what it means for these operations not to be consumed, especially the flatten layer: conv_2_flat is used as the input to the fully connected layer dense_3.
 
I have a feeling I am using the wrong argument for "--out_node". I pass "output_layer/Network_Output/MatMul" instead of just "output_layer/Network_Output" because the latter made the program quit with an error saying that "output_layer/Network_Output" does not exist. I printed out all the op names it was aware of; many used "output_layer/Network_Output" as a prefix (MatMul included), but there was no node with that exact name. I picked the one that sounded right (MatMul) and the converter got past that layer, but it feels wrong.
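 
For reference, the snippet I used to print the op names was roughly the following (a minimal sketch that just imports the exported meta graph and lists every operation):
 
#!/usr/bin/env python
 
import tensorflow as tf
 
# Import the exported graph and print every operation name, so that
# valid candidates for --out_node can be spotted.
tf.train.import_meta_graph('./export/model.ckpt.meta')
for op in tf.get_default_graph().get_operations():
    print(op.name)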
 
What am I doing wrong?
 
Thanks
Alexander
amirajaee
Join Date: 31 Jul 18
Posts: 14
Posted: Thu, 2019-06-06 18:45

Any update on this?

gesqdn-forum
Join Date: 4 Nov 18
Posts: 184
Posted: Tue, 2019-07-23 02:43
Hi,
 
This issue occurs because 'tf.contrib.layers.flatten(conv_2)', used in the flatten layer (conv_2_flat), is not supported by the SNPE converter: it is lowered to Shape, StridedSlice, and Pack operations, which is exactly what shows up as unconsumed in your log.
 
I ran your code with 'tf.reshape()', replacing 'conv_2_flat = tf.contrib.layers.flatten(conv_2)' with 'conv_2_flat = tf.reshape(conv_2, [-1, 8 * 8 * 32])', and was then able to convert the model to .dlc using the snpe-tensorflow-to-dlc tool.
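 
In other words, the flatten scope becomes the following (a minimal sketch; 8 * 8 * 32 = 2048, since your 64x64 input comes out of the two SAME-padded convolutions, stride 4 then stride 2, as an 8x8x32 activation):
 
with tf.variable_scope('conv_2_flat'):
    # A plain Reshape is mapped by the SNPE converter, unlike
    # tf.contrib.layers.flatten, which lowers to Shape/StridedSlice/Pack.
    # 64x64 input -> 16x16 after conv_1 (stride 4, SAME)
    #             -> 8x8 after conv_2 (stride 2, SAME), with 32 filters.
    conv_2_flat = tf.reshape(conv_2, [-1, 8 * 8 * 32])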
 
Also, while converting the .meta file to .dlc you are passing the wrong argument to '-i': the input dimensions must also include the batch size. If you do not know the batch size, use 1, as shown below.
 
snpe-tensorflow-to-dlc --graph ./export/model.ckpt.meta -i "input_layer/Network_Input" 1,64,64,12 --out_node output_layer/Network_Output/MatMul