alprnbg opened a new issue #8880:
URL: https://github.com/apache/tvm/issues/8880


   Greetings. I encountered the following error while converting a jitted torch model in which I create multiple boolean tensors whose values never change across inputs.
   ```
     File "/home/libraries/tvm/python/tvm/relay/frontend/pytorch.py", line 3330, in from_pytorch
       ret = converter.convert_operators(_get_operator_nodes(graph.nodes()), outputs, ret_name)[0]
     File "/home/libraries/tvm/python/tvm/relay/frontend/pytorch.py", line 2751, in convert_operators
       inputs, _get_input_types(op_node, outputs, default_dtype=self.default_dtype)
     File "/home/libraries/tvm/python/tvm/relay/frontend/pytorch.py", line 675, in zeros
       dtype = _convert_dtype_value(inputs[1])
     File "/home/libraries/tvm/python/tvm/relay/frontend/pytorch.py", line 2824, in _convert_dtype_value
       raise NotImplementedError(msg)
   NotImplementedError: Torch data type value 11 is not handled yet.
   ```
   When I looked at tvm/python/tvm/relay/frontend/pytorch.py, I saw that _convert_dtype_value calls _convert_data_type, which appears to support torch.bool. So I added the entry `11: "torch.bool"` to convert_torch_dtype_map, and my model was converted successfully. I also compared the results of the TVM and JIT models.
   
   Is this approach wrong, or should the convert_torch_dtype_map dictionary inside _convert_dtype_value be updated accordingly?
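   For reference, here is a minimal sketch of the change I mean. This is an approximation of _convert_dtype_value (the exact map contents in pytorch.py may differ); the key point is that PyTorch's c10 ScalarType enum assigns the value 11 to Bool, so the proposed fix is the added `11: "torch.bool"` entry:
   ```python
   def _convert_dtype_value(val):
       """Map a PyTorch ScalarType enum value to a torch dtype string.

       Sketch only; mirrors the structure of the function in
       tvm/python/tvm/relay/frontend/pytorch.py.
       """
       convert_torch_dtype_map = {
           7: "torch.float64",
           6: "torch.float32",
           5: "torch.float16",
           4: "torch.int64",
           3: "torch.int32",
           2: "torch.int16",
           1: "torch.int8",
           0: "torch.uint8",
           11: "torch.bool",  # proposed addition: c10 ScalarType::Bool == 11
       }
       if val in convert_torch_dtype_map:
           return convert_torch_dtype_map[val]
       msg = "Torch data type value %d is not handled yet." % val
       raise NotImplementedError(msg)
   ```
   With this entry present, a jitted graph that constructs constant boolean tensors (e.g. via aten::zeros with dtype 11) converts without hitting the NotImplementedError above.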

