def network(input):
    conv1 = slim.conv2d(input, 12, [3, 3], rate=1, scope='conv1')
    output = tf.depth_to_space(conv1, 2, name='output')
    return output
input = tf.placeholder(tf.float32, [None, 512, 512, 3], name='input')
gtruth = tf.placeholder(tf.float32, [None, None, None, 3])
out_image = network(input)
loss = tf.reduce_mean(tf.abs(out_image - gtruth))
lr = tf.placeholder(tf.float32)
opt = tf.train.AdamOptimizer(learning_rate=lr).minimize(loss)
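The loss above is a plain mean absolute error (L1) between the network output and the ground truth. As a sanity check, the same computation in numpy (the `l1_loss` name and the sample arrays are just for illustration):

```python
import numpy as np

def l1_loss(pred, target):
    # Mean absolute error, same as tf.reduce_mean(tf.abs(pred - target))
    return np.mean(np.abs(pred - target))

pred = np.array([[1.0, 2.0], [3.0, 4.0]])
target = np.array([[1.0, 1.0], [3.0, 2.0]])
print(l1_loss(pred, target))  # 0.75
```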
After training the model and saving it as a .pb file, I tried to convert it to a .dlc file, but the conversion failed with the following warnings and error:
2018-06-12 14:20:43,751 - 365 - WARNING - WARNING_TF_SCOPE_OP_NOT_CONSUMED: Operation (conv1/Conv2D) not consumed by converter: Conv2D.
2018-06-12 14:20:43,751 - 365 - WARNING - WARNING_TF_SCOPE_OP_NOT_CONSUMED: Operation (DepthToSpace) not consumed by converter: DepthToSpace.
2018-06-12 14:20:43,751 - 365 - WARNING - WARNING_TF_SCOPE_OP_NOT_CONSUMED: Operation (conv1/Relu) not consumed by converter: Relu.
2018-06-12 14:20:43,751 - 365 - WARNING - WARNING_TF_SCOPE_OP_NOT_CONSUMED: Operation (conv1/BiasAdd) not consumed by converter: BiasAdd.
2018-06-12 14:20:43,751 - 123 - ERROR - Conversion failed: ERROR_TF_OPERATION_NOT_MAPPED_TO_LAYER: Some operations in the Tensorflow graph were not resolved to a layer. You can use --allow_unconsumed_nodes for partial graph resolution!
I know that tf.depth_to_space is not supported by SNPE, but why are other ops like Conv2D not consumed by the converter either? If I delete the tf.depth_to_space call, the network becomes:
def network(input):
    conv1 = slim.conv2d(input, 12, [3, 3], rate=1, scope='conv1')
    return conv1
Then everything converts fine. I don't understand why tf.depth_to_space affects the other ops.
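For what it's worth, DepthToSpace is just a fixed rearrangement of data: reshape, transpose, reshape. A possible workaround (assuming the target converter handles reshape/transpose) is to express it with those primitive ops instead. As a sanity check on the data movement, here is a numpy sketch equivalent to tf.depth_to_space in NHWC layout (the function name and test values are my own):

```python
import numpy as np

def depth_to_space(x, block):
    # x: [N, H, W, C] with C divisible by block**2 (NHWC, matching TF semantics)
    n, h, w, c = x.shape
    oc = c // (block * block)
    # Split channels into (block, block, oc) ...
    x = x.reshape(n, h, w, block, block, oc)
    # ... interleave the block dims into the spatial dims ...
    x = x.transpose(0, 1, 3, 2, 4, 5)          # [N, H, block, W, block, oc]
    # ... and merge them: each spatial position grows by `block` in H and W.
    return x.reshape(n, h * block, w * block, oc)

x = np.arange(12, dtype=np.float32).reshape(1, 1, 1, 12)
y = depth_to_space(x, 2)
print(y.shape)  # (1, 2, 2, 3)
```

With the network above (12 conv channels, block size 2), this turns a [N, 512, 512, 12] tensor into [N, 1024, 1024, 3], which matches the RGB output the ground-truth placeholder expects.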