xinxilwl commented on PR #17587: URL: https://github.com/apache/tvm/pull/17587#issuecomment-2833113742
There are two ways to keep `attention` and `attention_bias` consistent. The first is to merge `attention_bias` into `attention`, as this PR does; because `attention_bias` disappears, it requires more work to update the existing test cases. The second is to add an `attention_bias` function in python/tvm/relax/op/nn/nn.py that reuses `attention`'s code; this is easier and keeps the change smaller. Which approach should we take? @yongwww @Hzfengsy @parsifal-47
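For reference, here is a minimal sketch of what the second option could look like. It assumes `attention` in python/tvm/relax/op/nn/nn.py keeps an optional `bias` argument with roughly its current `(query, key, value, bias, scale, causal_mask)` signature; the wrapper name and argument list are illustrative, not a final proposal.

```python
# Hypothetical sketch of option 2: keep attention_bias as a thin wrapper
# in python/tvm/relax/op/nn/nn.py that reuses attention's implementation.
from tvm.relax import Expr
from tvm.relax.op.nn import attention  # not needed if defined in the same module


def attention_bias(
    query: Expr,
    key: Expr,
    value: Expr,
    bias: Expr,
    scale=None,
    causal_mask=None,
) -> Expr:
    """Fused attention with a required bias, delegating to attention()."""
    # attention() already accepts an optional bias; forwarding to it keeps
    # the two ops consistent without duplicating code or reworking tests.
    return attention(query, key, value, bias=bias, scale=scale, causal_mask=causal_mask)
```

The trade-off is that the wrapper keeps `attention_bias` as a public symbol (so existing tests stay untouched), while the PR's current approach removes it and funnels everything through `attention`.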
