rutkoor commented on code in PR #15679:
URL: https://github.com/apache/tvm/pull/15679#discussion_r1324346506


##########
python/tvm/relax/transform/legalize_ops/manipulate.py:
##########
@@ -205,3 +213,16 @@ def te_layout_transform(data, name):
     output_dtype = call_args[0].struct_info.dtype
     output_sinfo = [TensorStructInfo(output_shape, output_dtype)]
     return call_tir(gvar, call_args, output_sinfo, tir_vars)
+
+
+@register_legalize("relax.remove_pad")
+def _remove_pad(bb: BlockBuilder, call: Call) -> Expr:
+    orig_shape = call.attrs.orig_shape
+
+    def te_remove_pad(data):
+        """
+        Returns a new compute that restricts the original expression to orig_shape.
+        """
+        return te.compute(orig_shape, data, name="te_remove_pad")

Review Comment:
   Technically, remove_pad could be extended to remove padding along a given axis at a given offset. In this PR, support is added explicitly for the AlterOp pass, to remove the padding introduced by the layout transform pass. I plan to generalize remove_pad in a follow-up PR.
   
   Would it be a good idea to keep this PR separate from that generalization of remove_pad?
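   
   A minimal sketch of what such a generalized remove_pad compute might look like. The names `te_remove_pad_general`, `axis`, `offset`, and `out_len` are hypothetical and not part of this PR; it assumes padding is stripped from a single axis:
   
       from tvm import te
   
       def te_remove_pad_general(data, axis, offset, out_len):
           """Keep out_len elements starting at offset along axis; drop the padding."""
           out_shape = list(data.shape)
           out_shape[axis] = out_len
   
           def fcompute(*indices):
               # Shift the index on the padded axis by offset into the source tensor.
               src = list(indices)
               src[axis] = indices[axis] + offset
               return data(*src)
   
           return te.compute(out_shape, fcompute, name="te_remove_pad_general")
   
   With axis=0, offset=0, and out_len equal to the original (pre-padding) extent, this reduces to the behavior used for the AlterOp case above.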


