I trained a QAT model, but I get an error when I try to convert it to DLC, whether directly or via the ONNX route. The PyTorch path reports that the `qnn.quantize` operator is not supported, and the ONNX path reports that the `QuantizeLinear` operator is not supported. Does SNPE not support QAT models?
PyTorch QAT model conversion to DLC failed
Posted: Mon, 2021-11-22 18:39
I hope you can reply, thanks!
Dear customer,
We have recently run into similar issues.
Would you mind uploading your model so we can have a try?
Thanks.
Dear,
SNPE can support QAT models; as long as your training method is right, you can get the corresponding DLC.
QAT only affects accuracy. Could you try converting your model without QAT training to see whether the conversion itself succeeds?
Sorry, we can't publish our model, but here is a demo that hits the same problem.
A quantized ResNet ONNX model can be obtained with the author's script in the PR below; you will encounter this problem when converting it with SNPE:
https://github.com/pytorch/pytorch/pull/42835
The model without QAT is too slow, and SNPE's own quantization loses too much accuracy.
My training script is adapted from the official PyTorch quantization training script, and the ONNX conversion follows the method found in the official PR.
Here are the two links. Models trained and exported with these scripts trigger the unsupported-operator error during conversion:
https://github.com/pytorch/vision/blob/main/references/classification/train_quantization.py
http://github.com/pytorch/pytorch/pull/42835
Dear customer,
Our latest SNPE tools (1.55.x, 1.56.x, etc.) now support QAT ONNX models.
If your SDK version is too old, it may not support QAT ONNX.
QAT PyTorch models are not supported yet, but you can use --quantization_overrides and specify a JSON file to supply quantization encodings.
Thanks.
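To illustrate the --quantization_overrides route mentioned above, here is a minimal sketch of what such an overrides file can look like. This assumes the AIMET-style encodings schema (`activation_encodings` / `param_encodings` keys with per-tensor `bitwidth`, `min`, `max`, `scale`, `offset` entries); the tensor names `conv1.weight` and `relu1.output` are placeholders for names from your own model, and the exact schema accepted by your SNPE version should be checked against its documentation.

```python
import json

# Sketch of a quantization-overrides file (AIMET-style encodings schema,
# assumed here). Tensor names are placeholders for your model's tensors.
overrides = {
    "activation_encodings": {
        "relu1.output": [
            # 8-bit unsigned range [0, 6]: scale = (max - min) / (2^8 - 1)
            {"bitwidth": 8, "min": 0.0, "max": 6.0,
             "scale": 6.0 / 255, "offset": 0}
        ]
    },
    "param_encodings": {
        "conv1.weight": [
            # 8-bit symmetric-ish range [-0.5, 0.5]
            {"bitwidth": 8, "min": -0.5, "max": 0.5,
             "scale": 1.0 / 255, "offset": -128}
        ]
    },
}

with open("overrides.json", "w") as f:
    json.dump(overrides, f, indent=2)

# The file would then be passed to the converter, e.g.:
#   snpe-onnx-to-dlc -i model.onnx --quantization_overrides overrides.json
```

The scale values follow the usual affine-quantization relation scale = (max - min) / (2^bitwidth - 1); the min/max numbers here are made up for illustration, and in practice would come from your QAT observers.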
Thank you for your reply.
My SNPE version is already up to date.
I provided a PyTorch-to-ONNX link above; you will hit the problem when converting the result with SNPE. Could you try it?