I want to disable LayerNormToInferUnpack for the layer_norm operator, but when I disable it, a TVMError occurs.
The error log is:

```
File "/home/zhongzheng.he/Project/stc-tvm/include/tvm/relay/op.h", line 534
TVMError: Check failed: idx < data_.size() && data_[idx].second != 0: Attribute TOpPattern has not been registered for Operator nn.layer_norm
```
When I add a TOpPattern attribute for the nn.layer_norm operator, the TVMError no longer occurs.
This is my patch:
```diff
--- a/src/relay/op/nn/nn.cc
+++ b/src/relay/op/nn/nn.cc
@@ -870,6 +870,7 @@ RELAY_REGISTER_OP("nn.layer_norm")
 .add_argument("gamma", "Tensor", "The gamma scale factor.")
 .add_argument("beta", "Tensor", "The beta offset factor.")
 .set_support_level(1)
+.set_attr<TOpPattern>("TOpPattern", kOpaque)
 .add_type_rel("LayerNorm", LayerNormRel);
```
So, why doesn't nn.layer_norm have a TOpPattern registered?
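For what it's worth, the same attribute can also be registered from the Python side without rebuilding TVM, which may be a lighter-weight workaround than patching nn.cc. This is a sketch assuming a TVM build where `nn.layer_norm` is registered but lacks TOpPattern, and where `register_pattern` and `OpPattern` are exposed under `tvm.relay.op`:

```python
# Hedged sketch: register TOpPattern for nn.layer_norm from Python,
# mirroring the kOpaque setting in the C++ patch above.
# Assumes tvm is importable and nn.layer_norm exists in this build.
from tvm.relay.op import register_pattern, OpPattern

# OpPattern.OPAQUE corresponds to kOpaque on the C++ side: the fusion
# pass will treat layer_norm as a fusion boundary and leave it alone.
register_pattern("nn.layer_norm", OpPattern.OPAQUE)
```

Registration happens once at import time of your own script, before any `relay.build` call that would query the attribute.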
---
[Visit
Topic](https://discuss.tvm.ai/t/why-doesnt-nn-layer-norm-have-toppattern/7046/1)
to respond.