Forums - pytorch qat model conversion to dlc failed

martinchenypl1
Join Date: 13 Jul 21
Posts: 4
Posted: Mon, 2021-11-22 18:39

I trained a QAT model, but I get an error when I try to convert it to DLC, whether converting directly or going through ONNX. The PyTorch path reports that the `qnn.quantize` operator is not supported, and the ONNX path reports that the `QuantizeLinear` operator is not supported. Does SNPE not support QAT models?

I hope you will reply, thx!
zhiguol
Join Date: 16 Dec 19
Posts: 16
Posted: Mon, 2021-11-22 23:10

Dear customer, 

We have recently run into some similar issues.

Would you mind uploading your model so we can have a try?

Thanks.

whuo
Join Date: 20 Jun 21
Posts: 1
Posted: Mon, 2021-11-22 23:32

Dear,

SNPE can support QAT models; as long as your training method is correct, you can get the corresponding DLC.

QAT only affects accuracy. Have you tried whether you can convert your model without QAT training?

martinchenypl1
Join Date: 13 Jul 21
Posts: 4
Posted: Mon, 2021-11-22 23:33

Sorry, we can't publish our model, but we can provide a demo that runs into the same problem.

A quantized ResNet ONNX model can be obtained through the author's script in this PR; you will encounter the problem when you convert it with SNPE:

https://github.com/pytorch/pytorch/pull/42835

 

martinchenypl1
Join Date: 13 Jul 21
Posts: 4
Posted: Mon, 2021-11-22 23:39

The model is too slow without QAT, and SNPE's own quantization loses too much accuracy.

My training script is modified from the official PyTorch quantization training script, and the ONNX conversion method comes from an official PR.

I can post the two links below. When a model trained with these scripts is converted, SNPE reports that some operators are not supported.

https://github.com/pytorch/vision/blob/main/references/classification/train_quantization.py

http://github.com/pytorch/pytorch/pull/42835

hwu5
Join Date: 22 Nov 21
Posts: 3
Posted: Tue, 2021-11-23 00:34

Dear customer,

Our latest SNPE tools (1.55.x, 1.56.x, etc.) now support QAT ONNX models.

If your SDK version is too old, it might not support QAT ONNX.

QAT PyTorch models are not supported yet, but you can pass --quantization_overrides with a JSON file to add quantization encodings.
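As a rough illustration of the overrides approach, the JSON file might look like the sketch generated below. This is only an assumption about the layout (it follows the AIMET-style encodings format commonly used with Qualcomm tooling); the tensor names "input" and "conv1.weight" are hypothetical placeholders, and you would use the actual tensor names and ranges from your own model.

```python
import json

# Hypothetical quantization-overrides file for the converter.
# Field names follow the AIMET-style encodings format (an assumption);
# tensor names and min/max ranges are placeholders, not from a real model.
overrides = {
    "activation_encodings": {
        "input": [
            {"bitwidth": 8, "min": -1.0, "max": 1.0,
             "scale": 2.0 / 255, "offset": -128}
        ]
    },
    "param_encodings": {
        "conv1.weight": [
            {"bitwidth": 8, "min": -0.5, "max": 0.5,
             "scale": 1.0 / 255, "offset": -128}
        ]
    },
}

# Write the overrides file that would be passed to the converter.
with open("encodings.json", "w") as f:
    json.dump(overrides, f, indent=2)
```

You would then supply the file on the converter command line via the --quantization_overrides flag mentioned above; the exact names of the other converter flags depend on your SDK version, so check `snpe-onnx-to-dlc --help` for your install.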

Thanks.

 

martinchenypl1
Join Date: 13 Jul 21
Posts: 4
Posted: Tue, 2021-11-23 01:01

Thank you for your reply.

My SNPE version is already up to date.

I have provided a link above for going from PyTorch to ONNX. You will encounter the problem when converting to SNPE; can you try it?

