Hi there! I'm trying to obtain gradients for some popular models, but it seems that TVM does not currently register a gradient for the `nn.batch_norm` operator. Is there a way to register gradients for unsupported ops?
```
import torch
import torch.nn as nn
from tvm import relay

model = nn.Sequential(
    nn.Conv2d(3, 3, kernel_size=3, padding=1),
    nn.BatchNorm2d(3),
)
# Grab the TorchScripted model via tracing
input_shape = [1, 3, 32, 32]
input_data = torch.randn(input_shape)
scripted_model = torch.jit.trace(model, input_data).eval()
input_name = "input0"
shape_list = [(input_name, input_data.shape)]
mod, params = relay.frontend.from_pytorch(scripted_model, shape_list)
mod = relay.transform.InferType()(mod)
bwd_mod = relay.transform.gradient(mod['main'], mode="first_order")
# Running this fails with:
"""
the operator nn.batch_norm does not have a registered gradient.
      1 mod = relay.transform.InferType()(mod)
----> 2 bwd_mod = relay.transform.gradient(mod['main'], mode="first_order")
      3 bwd_mod

~/Workspace/tvm/python/tvm/relay/transform/transform.py in gradient(expr, mod, mode)
"""
```
---
[Visit Topic](https://discuss.tvm.apache.org/t/errors-when-obtaining-gradients-for-nn-batch-norm/11304/1) to respond.