locnd182644 commented on issue #18600:
URL: https://github.com/apache/tvm/issues/18600#issuecomment-3728195780

   
   ### Explanation:
   - This issue happens because the bias shape is checked before the broadcast op.
   - The bias shape in the ONNX graph looks incorrect: the bias must have `out_channels` elements (16 here; see the references below). I don't know how onnxruntime and onnx.reference handle this; maybe they slice the bias to make it fit. The sketch below can be used to check.
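
   To check empirically what onnxruntime does with such a graph, here is a minimal sketch (not taken from the issue; the shapes and attributes mirror the failing graph, and `onnx`/`onnxruntime` are assumed installed):
   ```
   import numpy as np
   import onnx
   import onnxruntime as ort
   from onnx import TensorProto, helper

   # ConvTranspose with weight (32, 16, 5, 5), i.e. out_channels = 16,
   # but a deliberately mismatched bias of size 32.
   weight = np.random.rand(32, 16, 5, 5).astype(np.float32)
   bias = np.random.rand(32).astype(np.float32)  # wrong: should be (16,)

   node = helper.make_node(
       "ConvTranspose",
       inputs=["x", "W", "B"],
       outputs=["y"],
       strides=[1, 1],
       pads=[2, 2, 2, 2],
   )
   graph = helper.make_graph(
       [node],
       "convtranspose_bad_bias",
       inputs=[helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, 32, 28, 28])],
       outputs=[helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, 16, 28, 28])],
       initializer=[
           helper.make_tensor("W", TensorProto.FLOAT, list(weight.shape), weight.flatten()),
           helper.make_tensor("B", TensorProto.FLOAT, list(bias.shape), bias),
       ],
   )
   model = helper.make_model(graph)
   onnx.checker.check_model(model)  # structural check only; may not catch the bias length

   # Whether onnxruntime accepts this (e.g. via slicing/broadcasting) or
   # rejects it is exactly what this observation is about.
   sess = ort.InferenceSession(model.SerializeToString(), providers=["CPUExecutionProvider"])
   out = sess.run(None, {"x": np.random.rand(1, 32, 28, 28).astype(np.float32)})
   print(out[0].shape)
   ```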
   
   ### References:
   - ConvTranspose2d in PyTorch
   ```
   import torch

   class ConvTransposeModel(torch.nn.Module):
       def __init__(self, w, b):
           super().__init__()
           self.conv_transpose = torch.nn.ConvTranspose2d(
               in_channels=32,
               out_channels=16,
               kernel_size=(5, 5),
               stride=(1, 1),
               padding=(2, 2),
           )
           self.conv_transpose.weight = torch.nn.Parameter(w.detach())
           self.conv_transpose.bias = torch.nn.Parameter(b.detach())

       def forward(self, x):
           return self.conv_transpose(x)

   weight = torch.rand(size=(32, 16, 5, 5))
   bias = torch.rand(size=(32,))  # mismatched: must be (16,) == out_channels
   mod = ConvTransposeModel(weight, bias)
   input_data = torch.rand(size=(1, 32, 28, 28))
   output = mod(input_data)  # raises the RuntimeError quoted below
   ```
   PyTorch itself rejects a bias of shape (32,):
   
   > File ~/anaconda3/envs/tvm-build-venv/lib/python3.11/site-packages/torch/nn/modules/conv.py:1161, in ConvTranspose2d.forward(self, input, output_size)
   >    1150 num_spatial_dims = 2
   >    1151 output_padding = self._output_padding(
   >    1152     input,
   >    1153     output_size,
   >    (...)
   >    1158     self.dilation,  # type: ignore[arg-type]
   >    1159 )
   > -> 1161 return F.conv_transpose2d(
   >    1162     input,
   >    1163     self.weight,
   >    1164     self.bias,
   >    1165     self.stride,
   >    1166     self.padding,
   >    1167     output_padding,
   >    1168     self.groups,
   >    1169     self.dilation,
   >    1170 )
   >
   > RuntimeError: Given transposed=1, weight of size [32, 16, 5, 5], expected bias to be 1-dimensional with 16 elements, but got bias of size [32] instead
   
   <img width="870" height="217" alt="Image" src="https://github.com/user-attachments/assets/d8cd2fef-eb1d-4c2f-b841-01c72f941f6b" />
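
   For contrast, a bias with shape `(out_channels,) = (16,)` is accepted. A minimal check, reusing the `ConvTransposeModel` class defined above:
   ```
   # Assumes `ConvTransposeModel` from the snippet above is in scope.
   import torch

   weight = torch.rand(size=(32, 16, 5, 5))
   bias_ok = torch.rand(size=(16,))  # matches out_channels
   mod_ok = ConvTransposeModel(weight, bias_ok)
   output = mod_ok(torch.rand(size=(1, 32, 28, 28)))
   print(output.shape)  # torch.Size([1, 16, 28, 28])
   ```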
   
   - PyTorch reference: https://docs.pytorch.org/docs/stable/generated/torch.nn.ConvTranspose2d.html#torch.nn.ConvTranspose2d
   
   <img width="600" height="300" alt="Image" src="https://github.com/user-attachments/assets/ca8e569b-1ec6-4c3d-982a-7bbb632f725d" />
   
   - ONNX reference: https://onnx.ai/onnx/operators/onnx__ConvTranspose.html
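
   On the TVM side, feeding that same model into the ONNX importer should reproduce the reported failure (a sketch, assuming a build where the Relax frontend `tvm.relax.frontend.onnx.from_onnx` is available; `model` is the ModelProto built in the sketch above):
   ```
   from tvm.relax.frontend.onnx import from_onnx

   # `model` comes from the onnxruntime sketch above.
   # Expected: the importer's shape check rejects the bias before the broadcast op.
   mod = from_onnx(model)
   ```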
   
   

