masahi commented on pull request #6472:
URL: https://github.com/apache/incubator-tvm/pull/6472#issuecomment-694784492
Thanks @yongwww @t-vi
masahi commented on pull request #6472:
URL: https://github.com/apache/incubator-tvm/pull/6472#issuecomment-693774304
I also want to take some time to study how the PyTorch ONNX export handles
inplace ops, and whether we can reuse some of their passes.
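For reference, a minimal sketch of how such a pass could be invoked from Python, assuming the ONNX exporter's cleanup is exposed through the internal binding `torch._C._jit_pass_remove_inplace_ops` (the exact binding name and behavior may vary across PyTorch versions):

```python
import torch

@torch.jit.script
def f(x: torch.Tensor) -> torch.Tensor:
    x.add_(1.0)  # in-place add; shows up as aten::add_ in the graph
    return x

graph = f.graph
# Internal JIT pass used on the ONNX export path to rewrite a small set of
# in-place ops (e.g. add_, sub_, mul_, div_) into their out-of-place forms.
torch._C._jit_pass_remove_inplace_ops(graph)
print(graph)  # aten::add_ should now appear as aten::add
```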
masahi commented on pull request #6472:
URL: https://github.com/apache/incubator-tvm/pull/6472#issuecomment-693773556
Yes, I think we can keep `clamp_`, but `copy_` seems to always require true
in-place semantics, so we shouldn't support it (even if we add a conversion of
`copy_`, the output from TVM would not match PyTorch, since TVM graphs have no
in-place mutation).
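To illustrate the distinction (a minimal sketch, not code from the PR): `clamp_` can be emulated by its out-of-place counterpart, whereas `copy_` usually writes through a view, which a purely functional graph cannot express with a local per-op rewrite.

```python
import torch

x = torch.randn(4)

# clamp_ can be treated as out-of-place clamp: same result, no mutation needed.
a = x.clone()
a.clamp_(min=0.0)
b = torch.clamp(x, min=0.0)
assert torch.equal(a, b)

# copy_ typically writes through a view, so the base tensor must observe the
# change; a no-mutation graph cannot reproduce this with a simple rewrite.
y = torch.zeros(2, 3)
y[0].copy_(torch.ones(3))      # row 0 of y is mutated through the view y[0]
assert torch.equal(y[0], torch.ones(3))
```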
masahi commented on pull request #6472:
URL: https://github.com/apache/incubator-tvm/pull/6472#issuecomment-693769682
I mean, what kind of real-world PyTorch code ends up generating `copy_`?
Nobody writes `torch.copy_()` explicitly, right?
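(For context, the usual source is index/slice assignment rather than an explicit call; below is a small sketch of how such an idiom ends up as `aten::copy_` in a TorchScript graph. The exact ops in the dump may differ between PyTorch versions.)

```python
import torch

@torch.jit.script
def assign_first_column(x: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    # `x[:, 0] = v` desugars into view ops (slice/select) followed by copy_.
    x[:, 0] = v
    return x

print(assign_first_column.graph)
# Expect nodes roughly like:
#   %3 : Tensor = aten::slice(%x, ...)
#   %4 : Tensor = aten::select(%3, ...)
#   %5 : Tensor = aten::copy_(%4, %v, ...)
```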
masahi commented on pull request #6472:
URL: https://github.com/apache/incubator-tvm/pull/6472#issuecomment-693766450
Is there a valid usage of `copy_` that we can (and you want to) support?
From the discussion so far, I was under the impression that `copy_` is always
generated from an idiom that …
masahi commented on pull request #6472:
URL: https://github.com/apache/incubator-tvm/pull/6472#issuecomment-693107876
> @masahi Just checked the op list under torch.Tensor; there are a bunch of
inplace ops there. It seems an op with the suffix `_` is the in-place one,
https://discuss.pytorch.org/t/ho
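(A tiny illustration of that naming convention: the trailing underscore marks the mutating variant of an op.)

```python
import torch

x = torch.tensor([-1.0, 2.0])

out = x.clamp(min=0.0)   # out-of-place: returns a new tensor, x is unchanged
x.clamp_(min=0.0)        # in-place: mutates x (and returns it)

assert torch.equal(x, out)
```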
masahi commented on pull request #6472:
URL: https://github.com/apache/incubator-tvm/pull/6472#issuecomment-692943139
@yongwww to be clear, does this translation work for the box_coder from
maskrcnn-benchmark that you mentioned above?
https://github.com/facebookresearch/maskrcnn-benchmark/blob/4ac0a3b4
masahi commented on pull request #6472:
URL: https://github.com/apache/incubator-tvm/pull/6472#issuecomment-692548630
> the graph is still the same as before
Yeah, this pass only supports a very small subset of inplace ops, see
https://github.com/pytorch/pytorch/blob/61623430d32c0
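(As a sanity check, one can enumerate which in-place ops remain in a TorchScript graph; `list_inplace_ops` below is just a hypothetical helper for illustration, relying only on the trailing-underscore convention.)

```python
import torch

def list_inplace_ops(graph):
    """Collect the kinds of aten:: nodes whose name ends with '_'."""
    return sorted({n.kind() for n in graph.nodes()
                   if n.kind().startswith("aten::") and n.kind().endswith("_")})

@torch.jit.script
def f(x: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    x.clamp_(min=0.0)
    x[:, 0] = v              # becomes a view op followed by aten::copy_
    return x

print(list_inplace_ops(f.graph))   # e.g. ['aten::clamp_', 'aten::copy_']
```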
masahi commented on pull request #6472:
URL: https://github.com/apache/incubator-tvm/pull/6472#issuecomment-692504259
> What you would need to do is to preprocess the graph to remove these
side-effects: for example, if you can exclude that the input to `clamp_` is a
view (e.g. because it c…
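(A rough sketch of what such a check could look like on a TorchScript graph, assuming a hand-written `VIEW_OPS` list and a hypothetical `inplace_on_view` helper; this is not the pass used in the PR.)

```python
import torch

# Ops that (may) return views of an existing tensor; the list is illustrative,
# not exhaustive.
VIEW_OPS = {"aten::slice", "aten::select", "aten::view",
            "aten::expand", "aten::permute", "aten::transpose"}

def inplace_on_view(graph):
    """Yield in-place nodes whose mutated input is produced by a view op."""
    for node in graph.nodes():
        kind = node.kind()
        if kind.startswith("aten::") and kind.endswith("_"):
            first_input = list(node.inputs())[0]
            if first_input.node().kind() in VIEW_OPS:
                yield node
```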
masahi commented on pull request #6472:
URL: https://github.com/apache/incubator-tvm/pull/6472#issuecomment-692460614
Yeah, that is also what happens with other inplace ops. The TorchScript graph
has distinct input/output tensors for each op, even though for an inplace op
the output and one of the inputs refer to the same underlying tensor.
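(This can be seen directly from a graph dump; a minimal example:)

```python
import torch

@torch.jit.script
def f(x: torch.Tensor) -> torch.Tensor:
    x.clamp_(min=0.0)
    return x

print(f.graph)
# Shows something like
#   %2 : Tensor = aten::clamp_(%x, %1, %none)
# i.e. the op has its own output Value, even though at runtime that output
# aliases (and mutates) the input %x.
```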