tinywisdom opened a new issue, #18365:
URL: https://github.com/apache/tvm/issues/18365
### Summary
Importing a model exported with torch.export that calls F.interpolate(...,
mode='bilinear', align_corners=False, antialias=True) fails in the TVM Relax
Torch frontend with:
```
AssertionError: Unsupported function types ['_upsample_bilinear2d_aa.vec']
```
PyTorch emits a dedicated ATen op for anti-aliased bilinear:
aten::_upsample_bilinear2d_aa.vec. The importer currently doesn’t cover this op.
### Environment
- OS: Ubuntu 22.04.4 LTS (x86_64)
- TVM version: release v0.21.0
- Python: 3.10.16
- LLVM: 17.0.6
- PyTorch: 2.7.1
### Steps to reproduce
```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AAInterp(nn.Module):
    def forward(self, x):
        # bilinear + antialias=True -> emits aten::_upsample_bilinear2d_aa.vec
        return F.interpolate(x, size=(64, 64), mode="bilinear",
                             align_corners=False, antialias=True)


def main():
    from torch.export import export as torch_export
    from tvm.relax.frontend.torch import from_exported_program

    model = AAInterp().eval()
    inp = torch.randn(1, 3, 32, 32)
    with torch.inference_mode():
        _ = model(inp)  # warmup
    ep = torch_export(model, (inp,))
    mod = from_exported_program(ep)  # <- raises the assertion
    print(mod)


if __name__ == "__main__":
    main()
```
### Output
```
Traceback (most recent call last):
  ...
  File ".../base_fx_graph_translator.py", line 116, in _check_unsupported_func_type
    assert not missing_func_types, f"Unsupported function types {missing_func_types}"
AssertionError: Unsupported function types ['_upsample_bilinear2d_aa.vec']
```
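Until the op is supported, one possible workaround (an assumption, with a quality trade-off: the result is not anti-aliased, so output values differ from the original model) is to export with antialias=False, which stays on the plain bilinear upsample path:

```python
import torch
import torch.nn.functional as F


class Interp(torch.nn.Module):
    def forward(self, x):
        # antialias=False avoids aten::_upsample_bilinear2d_aa.vec entirely
        return F.interpolate(x, size=(64, 64), mode="bilinear",
                             align_corners=False, antialias=False)


y = Interp()(torch.randn(1, 3, 32, 32))
print(y.shape)  # torch.Size([1, 3, 64, 64])
```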
### Triage
* needs-triage
* bug
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]