Dear Sir or Madam,
I've encountered a problem when converting my caffemodel into a DLC file. The converter reports the following error:
ERROR_CAFFE_LAYER_OF_TYPE_SCALE_NOT_PRECEEDED_BY_BATCHNORM: Cannot resolve Scale layer conv1_1/negative as it is not preceded by a BatchNorm layer into which it can be folded.
For your information, the relevant snippet of my prototxt file is pasted below:
---------------------------------
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 1
      dim: 3
      dim: 720
      dim: 1280
    }
  }
}
layer {
  name: "conv1_1/conv"
  type: "Convolution"
  bottom: "data"
  top: "conv1_1/conv"
  convolution_param {
    num_output: 16
    bias_term: false
    pad: 3
    kernel_size: 7
    stride: 2
  }
}
layer {
  name: "conv1_1/bn"
  type: "BatchNorm"
  bottom: "conv1_1/conv"
  top: "conv1_1/conv"
  batch_norm_param {
    use_global_stats: true
  }
}
layer {
  name: "conv1_1/negative"
  type: "Scale"
  bottom: "conv1_1/conv"
  top: "conv1_1/negative"
  scale_param {
    num_axes: 1
    bias_term: false
  }
}
layer {
  name: "conv1_1/concat"
  type: "Concat"
  bottom: "conv1_1/conv"
  bottom: "conv1_1/negative"
  top: "conv1_1/concat"
}
layer {
  name: "conv1_1/scale"
  type: "Scale"
  bottom: "conv1_1/concat"
  top: "conv1_1/concat"
  scale_param {
    num_axes: 1
    bias_term: true
  }
}
------------------------
To clarify: the BatchNorm layer conv1_1/bn writes its output in place to conv1_1/conv. That blob is passed through the Scale layer conv1_1/negative to produce its negation, the negated values are concatenated with conv1_1/conv itself, and the combined blob is then fed into the Scale layer conv1_1/scale, which completes the BatchNorm+Scale computation for both halves.
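One workaround I'm considering, though I haven't verified it against the converter: assuming the weights of conv1_1/negative are fixed at -1 (as its name suggests), the same negation can be expressed with Caffe's Power layer, which computes (shift + scale * x) ^ power. Since it is not a Scale layer, the BatchNorm-folding restriction named in the error message should not apply to it:
---------------------------------
layer {
  name: "conv1_1/negative"
  type: "Power"
  bottom: "conv1_1/conv"
  top: "conv1_1/negative"
  power_param {
    power: 1
    scale: -1
    shift: 0
  }
}
---------------------------------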
Would you please share your thoughts on how to avoid this error message?
Cheers,
Hi yanbophx,
I've run into the same problem. Have you solved it? I'd be grateful for your help.
Best wishes!
I get the same problem when trying to convert a ResNet-18 Caffe model as well. If someone is able to solve this problem, please post the solution.
Thanks