I built a TF network in Keras and froze it as a .pb file, but when attempting to convert to DLC I see the following error:
ERROR - Conversion failed: ERROR_TF_CONV_RESOLVE_WEIGHTS: Cannot resolve convolution layer due to missing weights for operation: inputNode_1/convolution/Conv2D.
Has anyone encountered an error like this before?
I also built a TF network in Keras and froze it as a .pb file. When attempting to convert to DLC, my error is similar to yours:
"ERROR - Conversion failed: Cannot resolve BatchNorm layer due to missing variance value."
I might be able to help. Could you please run this with your .pb file and share the nodes corresponding to the layer in the error message? (e.g. all the nodes in the "inputNode_1" layer). Will be helpful if you also include the layer just before it.
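(The snippet referred to above is not preserved in this thread. A minimal node-listing sketch along these lines would do the job; assumes a TF 1.x-style frozen GraphDef, and the file path and prefix below are placeholders:)

```python
def summarize_nodes(graph_def, prefix=""):
    """Return (op, name) for every node whose name starts with prefix."""
    return [(n.op, n.name) for n in graph_def.node if n.name.startswith(prefix)]

def load_graph_def(pb_path):
    # Requires TensorFlow; pb_path points at your frozen .pb file.
    import tensorflow as tf
    graph_def = tf.compat.v1.GraphDef()
    with tf.io.gfile.GFile(pb_path, "rb") as f:
        graph_def.ParseFromString(f.read())
    return graph_def

# Example usage (path and prefix are placeholders):
#   for op, name in summarize_nodes(load_graph_def("frozen_model.pb"), "inputNode_1"):
#       print(op, name)
```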
Thanks shiangyong,
I rebuilt the model in pure TensorFlow to eliminate Keras as a possible source of the problem, but am still seeing the same error.
I therefore ran the code you suggested against my model and this is the full output:
I am not certain that I am using the correct input node, but it's not clear that this would be the cause of the problem.
P.S. I am using TensorFlow Conv1D layers as follows:
Hi vellamike,
I think you're right. I didn't see any support for Conv1D in the SNPE docs.
I also noticed other potential issues in your model:
- There are dropout nodes that should have been removed when you froze the variables
- It does seem strange that your input node is the "Reshape" node. The input node usually has the "Placeholder" op
Shiang Yong
I think the dropout nodes only get removed when you run the optimize_for_inference tool; I tried doing this but the original issue remained.
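(For anyone following along: the tool is invoked roughly as `python -m tensorflow.python.tools.optimize_for_inference --input frozen.pb --output optimized.pb --input_names=... --output_names=...`, with the names being placeholders for your graph. Conceptually it does more than the crude sketch below, since it also rewires the inputs of the surviving nodes, but the dropout-stripping part amounts to filtering nodes by name:)

```python
def strip_dropout_nodes(nodes):
    # Crude illustration only: drop nodes whose names mention "dropout".
    # The real tool also reconnects the inputs of downstream nodes so the
    # graph stays valid, which this sketch does not attempt.
    return [n for n in nodes if "dropout" not in n.name.lower()]
```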
I think that a conv1d layer in TensorFlow is just a convenient wrapper around conv2d, so I'm not completely convinced that this is the issue, but I will try expressing the model with conv2d and report back on how well it works.
I think you are right about the Conv1D acting as a wrapper. Those ExpandDims operations just before the Conv2D are likely part of that.
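(To make the wrapper relationship concrete, here is a small NumPy sketch of my own, not TF source code: a 1-D convolution is exactly a 2-D convolution over an input with an extra height-1 dimension, which is what those ExpandDims ops insert:)

```python
import numpy as np

def conv1d_naive(x, w):
    # x: (length, in_ch), w: (kernel_width, in_ch, out_ch); valid padding, stride 1.
    L, _ = x.shape
    kw, _, oc = w.shape
    out = np.zeros((L - kw + 1, oc))
    for i in range(L - kw + 1):
        out[i] = np.tensordot(x[i:i + kw], w, axes=([0, 1], [0, 1]))
    return out

def conv1d_via_conv2d(x, w):
    # Mimic the wrapper: expand to a height-1 image, run a 2-D conv, squeeze.
    x2 = x[None, :, :]        # (1, L, in_ch): height-1 "image"
    w2 = w[None, :, :, :]     # (1, kw, in_ch, out_ch): height-1 kernel
    H, L, _ = x2.shape
    kh, kw, _, oc = w2.shape
    out = np.zeros((H - kh + 1, L - kw + 1, oc))
    for i in range(H - kh + 1):
        for j in range(L - kw + 1):
            out[i, j] = np.tensordot(x2[i:i + kh, j:j + kw], w2,
                                     axes=([0, 1, 2], [0, 1, 2]))
    return out[0]             # squeeze the height-1 dimension back out
```

Both functions produce identical results on the same input, which is why the Conv1D-vs-Conv2D distinction only matters to tools (like the DLC converter) that pattern-match on the graph structure.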
I converted my Conv1D layers to 2D layers and the original error went away. It has been replaced by a new error, which I will start a new thread about.