About the warning: WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported

about the warning: WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported
mons2us
Join Date: 20 Feb 24
Posts: 4
Posted: Tue, 2024-03-12 01:01

Hi. I'm trying to develop a pipeline that starts by converting a torch model (.pt) to ONNX and then converts the ONNX model to a .dlc file.

The torch model I'm using calls the grid_sample function, and ONNX requires opset version 16 or higher for that operator.
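
For reference, the export call I'm using looks roughly like this (same arguments as the call I paste further down in the thread, just with opset_version=16):

import torch

torch.onnx.export(
    model,                      # my torch.nn.Module
    args=(y_0, y_1),            # sample inputs
    f='testnet.onnx',
    export_params=True,
    opset_version=16,           # grid_sample needs opset >= 16
    do_constant_folding=True,
    input_names=['y_0', 'y_1'],
    output_names=['output'],
    dynamic_axes={'output': {0: 'batch_size'}})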

However, with opset version 16, I get the warnings below:

$ snpe-onnx-to-dlc -i testnet.onnx --out_node output -o testnet.dlc           
/opt/qcom/aistack/snpe/latest/lib/python/qti/aisw/converters/onnx/rnn_translations.py:411: SyntaxWarning: "is not" with a literal. Did you mean "!="?
  return [name for name in formatted_input_names if name is not '']
2024-03-12 07:53:02,513 - 235 - INFO - Successfully simplified the onnx model in child process
2024-03-12 07:53:02,758 - 235 - INFO - Successfully receive the simplified onnx model in main process
2024-03-12 07:53:03,141 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,142 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,143 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,144 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,145 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,146 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,147 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,148 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,149 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,150 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,153 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,154 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,156 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,159 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,163 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,166 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,169 - 240 - WARNING - WARNING_OP_VERSION_NOT_SUPPORTED: Operation Relu Not Supported. Expected operator version: [1, 6], instead got version: [14]
2024-03-12 07:53:03,171 - 240 - WARNING - WARNING_GEMM: GEMM operation is not supported in the general case, attempting to interpret as FC
2024-03-12 07:53:03,315 - 235 - INFO - INFO_INITIALIZATION_SUCCESS: 
2024-03-12 07:53:03,555 - 235 - INFO - INFO_CONVERSION_SUCCESS: Conversion completed successfully
2024-03-12 07:53:03,592 - 235 - INFO - INFO_WRITE_SUCCESS: 
Ignoring the warnings and proceeding, the conversion completes, but SNPE quantization then fails.
With opset version 11 there is no problem with ReLU, but then grid_sampler is not supported...
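
For what it's worth, the opset actually stamped on the exported model can be checked with the onnx package (a minimal sketch, using the same filename as above):

import onnx

model = onnx.load('testnet.onnx')
# Print the opset_import entries; with opset 16 the default ai.onnx domain shows 16,
# while the converter only knows Relu at versions [1, 6], hence the warnings above.
for opset in model.opset_import:
    print(opset.domain or 'ai.onnx', opset.version)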
 
Is there any solution for this issue?
sanjjey.a.sanjjey
Join Date: 17 May 22
Posts: 55
Posted: Tue, 2024-03-12 07:26

Hi, 

Can you try setting your opset version to 6?

Thanks

mons2us
Join Date: 20 Feb 24
Posts: 4
Posted: Wed, 2024-03-13 21:35

Hi, thanks for the reply.

When I run the code below,

import torch

torch.onnx.export(
    model,
    args=(y_0, y_1),
    f='testnet.onnx',
    export_params=True,
    opset_version=6,            # opset version suggested above
    do_constant_folding=True,
    input_names=['y_0', 'y_1'],
    output_names=['output'],
    dynamic_axes={'output': {0: 'batch_size'}})

I get the following error, saying that opset version 6 is not supported:

File ~/anaconda3/envs/dl_env/lib/python3.8/site-packages/torch/onnx/utils.py:506, in export(model, args, f, export_params, verbose, training, input_names, output_names, operator_export_type, opset_version, do_constant_folding, dynamic_axes, keep_initializers_as_inputs, custom_opsets, export_modules_as_functions)
    188 @_beartype.beartype
    189 def export(
    190     model: Union[torch.nn.Module, torch.jit.ScriptModule, torch.jit.ScriptFunction],
        (...)
    206     export_modules_as_functions: Union[bool, Collection[Type[torch.nn.Module]]] = False,
    207 ) -> None:
    208     r"""Exports a model into ONNX format.

...

     73 if value not in supported_versions:
---> 74     raise ValueError(f"Unsupported ONNX opset version: {value}")
     75 self._export_onnx_opset_version = value

ValueError: Unsupported ONNX opset version: 6

Maybe I'm doing something wrong?
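
In case it helps, here is a quick throwaway sketch to see which opset versions this torch build actually accepts for export (the Tiny module and output paths are just placeholders):

import torch

# Tiny throwaway model, only used to probe torch.onnx.export at various opset versions
class Tiny(torch.nn.Module):
    def forward(self, x):
        return torch.relu(x)

dummy = torch.randn(1, 3)
for opset in range(6, 18):
    try:
        torch.onnx.export(Tiny(), (dummy,), f'/tmp/tiny_{opset}.onnx', opset_version=opset)
        print(f'opset {opset}: supported')
    except ValueError as err:  # e.g. "Unsupported ONNX opset version: 6"
        print(f'opset {opset}: {err}')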

