larroy removed a comment on issue #14992: [MXNET-978] Support higher order 
gradient for `log`.
URL: https://github.com/apache/incubator-mxnet/pull/14992#issuecomment-496750425
 
 
   "I believe it would be better to have gradients defined for existing 
backward, instead of a differentiable gradient (relying on autograd machinery) 
at least on ops where backward is not trivial. It will allow to use existing 
optimised fused kernels and make sure there is no regression in the backward."
   
   I think you are right; we have discussed this before. I think we rushed 
merging this PR. We should run a check for regressions in the backward pass.
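
   For context, here is a minimal sketch of the "differentiable gradient relying 
on autograd machinery" approach the quote refers to, using MXNet's Python 
`autograd` API with `log` as the op under discussion. The array values and 
variable names are illustrative only, not taken from the PR: the first-order 
gradient is recorded with `create_graph=True` so it can itself be differentiated.

   ```python
   from mxnet import nd, autograd

   x = nd.array([1.0, 2.0, 4.0])
   x.attach_grad()

   with autograd.record():
       y = nd.log(x)  # forward: y = log(x)
       # First-order gradient dy/dx = 1/x, built as a differentiable graph
       # (create_graph=True) so it can be backpropagated through again.
       dy_dx = autograd.grad(y, x, create_graph=True, retain_graph=True)[0]

   # Differentiate the first-order gradient: d2y/dx2 = -1/x^2 lands in x.grad.
   dy_dx.backward()
   print(x.grad)  # expected: [-1.0, -0.25, -0.0625]
   ```

   By contrast, defining the second-order gradient directly on the existing 
backward op would keep the fused backward kernel on the first-order path, which 
is the regression concern raised in the quote above.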

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
