Forums - Conv3d

Conv3d
ktjktj0911
Join Date: 2 Jan 23
Posts: 14
Posted: Sun, 2023-07-30 15:38

Hello everyone,

I am currently working with SNPE 2.12 and QNN 2.12, and both have exactly the same problem.

I am trying to run a Conv3D U-Net architecture on HTP; however, the decoder part does not work on SNPE because of Conv3DTranspose.

Thus, I have tried to mimic Conv3DTranspose with an operation similar to PixelShuffle (increasing the channel count, then reshaping to expand width, height, and depth).

However, this also gives an error about the rank. It seems that reshaping a 5D tensor is not supported on SNPE.

Here is the model that I have tested.

import tensorflow as tf

# Encoder: three stride-2 Conv3D layers, (64,64,48,1) -> (8,8,6,32)
input = tf.keras.layers.Input((64, 64, 48, 1))
x = tf.keras.layers.Conv3D(filters=8, kernel_size=3, strides=(2, 2, 2), padding="same")(input)
x = tf.keras.layers.Conv3D(filters=16, kernel_size=3, strides=(2, 2, 2), padding="same")(x)
x = tf.keras.layers.Conv3D(filters=32, kernel_size=3, strides=(2, 2, 2), padding="same")(x)

# Decoder: 1x1x1 Conv3D increases channels, then a 5D reshape redistributes them into width/height/depth
x = tf.keras.layers.Conv3D(filters=256, kernel_size=1, padding="same")(x)
x = tf.reshape(x, (-1, 16, 16, 12, 32))   # 5D reshape: this is where SNPE reports the rank error
x = tf.keras.layers.Conv3D(filters=16, kernel_size=3, padding="same")(x)

x = tf.keras.layers.Conv3D(filters=128, kernel_size=1, padding="same")(x)
x = tf.reshape(x, (-1, 32, 32, 24, 16))
x = tf.keras.layers.Conv3D(filters=8, kernel_size=3, padding="same")(x)

x = tf.keras.layers.Conv3D(filters=64, kernel_size=1, padding="same")(x)
x = tf.reshape(x, (-1, 64, 64, 48, 8))
x = tf.keras.layers.Conv3D(filters=1, kernel_size=3, padding="same")(x)
output = tf.identity(x)

model = tf.keras.Model(input, output)
tf.keras.models.save_model(model, "3dconv")
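
For reference, here is a minimal sketch of the alternative decoder block I would try next: upsample with UpSampling3D and follow it with a Conv3D, so neither Conv3DTranspose nor a 5D reshape is needed. I have not verified whether the 3D resize behind UpSampling3D is actually supported on the HTP backend, so please treat this only as a sketch.

import tensorflow as tf

# Sketch only: replaces Conv3DTranspose with nearest-neighbour upsampling + Conv3D.
# Whether UpSampling3D (3D resize) runs on HTP is an assumption, not something I have confirmed.
def upsample_block(x, filters):
    x = tf.keras.layers.UpSampling3D(size=(2, 2, 2))(x)                      # doubles width, height, depth
    return tf.keras.layers.Conv3D(filters, kernel_size=3, padding="same")(x)

inp = tf.keras.layers.Input((8, 8, 6, 32))   # bottleneck shape from the encoder above
x = upsample_block(inp, 16)                  # -> (16, 16, 12, 16)
x = upsample_block(x, 8)                     # -> (32, 32, 24, 8)
x = upsample_block(x, 1)                     # -> (64, 64, 48, 1)
decoder = tf.keras.Model(inp, x)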
 

Is there a way to use Conv3DTranspose or to reshape a 5D tensor?

Thank you.

yunxqin
Join Date: 2 Mar 23
Posts: 44
Posted: Fri, 2023-09-08 19:12

Dear developer,

You can use a newer version of SNPE, which supports 5D Reshape nodes.

BR.

Yunxiang

