inadob commented on issue #4805: [Frontend][TFlite] Add parser support for relu6, leaky_relu, relu_n1_to_1, log_softmax
URL: https://github.com/apache/incubator-tvm/pull/4805#issuecomment-592482232

> Relu and Clip implementation does not look right.
>
> We can keep the computation in integer domain. The way to do that is to subtract the input zero point, and then call Relu, then requantize to the output scale (only if output scale/zero point are different from input scale/zero point).

That's fine for the standard ReLU op, but I am not entirely sure we can follow this logic for ReLU6 and ReLU1. Is there a way to recreate them in Relay without clip? What I did in this case was to shift the data by subtracting the zero point, apply clip, shift back, and finally requantize if needed. The problem is that `Clip()` from `RELAY_PASS_PATTERN` expects the clipping range as `double`, and I couldn't fix this simply by casting to `float64` (it seems `double` is not supported in TVM). @anijain
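For what it's worth, here is a minimal NumPy sketch (not TVM/Relay code) of the integer-domain approach: instead of shifting the data, map the real clip bounds [0, 6] into the quantized domain and clip there, so no float math touches the tensor. The function name `quantized_relu6` and its parameters are hypothetical, purely for illustration.

```python
import numpy as np

def quantized_relu6(q_input, scale, zero_point, qmin=-128, qmax=127):
    """Integer-domain ReLU6 sketch.

    Rather than subtracting the zero point, clipping in real values,
    and shifting back, translate the real bounds into the quantized
    domain once:
        real 0.0 -> zero_point
        real 6.0 -> zero_point + round(6 / scale)
    and clip the raw integer tensor directly.
    """
    lo = max(zero_point, qmin)                              # real 0.0
    hi = min(zero_point + int(round(6.0 / scale)), qmax)    # real 6.0
    return np.clip(q_input, lo, hi).astype(q_input.dtype)
```

If the output scale/zero point differ from the input's, a requantize would follow the clip; for ReLU1 (`relu_n1_to_1`) the same trick applies with bounds `zero_point + round(-1 / scale)` and `zero_point + round(1 / scale)`.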
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at: [email protected]

With regards,
Apache Git Services
