apeforest commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-507862358
@kshitij12345 Could you approve the PR if everything looks good to you now? Thanks!
apeforest commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-507748706
@sxjscience Please help review this PR. Thanks!
apeforest commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-507516653
I verified the result is the same as PyTorch:
```
import torch
import numpy as np
import math

# NOTE: the original snippet was truncated after "op ="; the lines below are
# a minimal reconstruction assuming the op under test is sigmoid.
op = torch.sigmoid

x = torch.tensor(0.5, requires_grad=True)
y = op(x)
(grad,) = torch.autograd.grad(y, x, create_graph=True)  # y' = y * (1 - y)
(grad_grad,) = torch.autograd.grad(grad, x)             # y'' = y' * (1 - 2y)
print(grad.item(), grad_grad.item())
```
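For comparison, a minimal sketch of what the MXNet side of such a check could look like, using the `autograd.grad(..., create_graph=True)` pattern that MXNet's imperative autograd supports (the input value here is an illustrative assumption):
```
import mxnet as mx
from mxnet import autograd, nd

x = nd.array([0.5])
x.attach_grad()
with autograd.record():
    y = nd.sigmoid(x)
    # Keep the first-order gradient in the recorded graph so it can be
    # differentiated again.
    y_grad = autograd.grad(y, x, create_graph=True, retain_graph=True)[0]
y_grad.backward()
print(x.grad)  # second-order gradient; should match the PyTorch result
```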
apeforest commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-503887509
@larroy I also added a method to dump the computation graph in imperative mode, since it will be very useful for debugging.
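As a rough illustration of inspecting the imperatively recorded graph, a sketch using the long-standing `mx.autograd.get_symbol` (not necessarily the new dump method added in this PR, whose name isn't shown here):
```
import mxnet as mx
from mxnet import autograd, nd

x = nd.array([0.5])
x.attach_grad()
with autograd.record():
    y = nd.sigmoid(x)
# Retrieve the computation history recorded in imperative mode as a Symbol
# and print its graph structure.
sym = autograd.get_symbol(y)
print(sym.tojson())
```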
apeforest commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-503887003
@larroy @sxjscience Please help review this PR. Thanks!
apeforest commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-503886931
@kshitij12345 I have figured out how backward works when one of the inputs is an output of the forward node. Please review.
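For the record, the dependency in question: sigmoid's backward consumes the forward output y = sigmoid(x) rather than the input x, since dy/dx = y * (1 - y). The second-order gradient therefore has to differentiate through y as well, giving d2y/dx2 = y * (1 - y) * (1 - 2y). A quick numerical sanity check of that identity:
```
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = 0.5
y = sigmoid(x)
dydx = y * (1 - y)            # backward is expressed in terms of the output y
d2ydx2 = dydx * (1 - 2 * y)   # differentiating through y as well

# Central finite difference of the second derivative.
eps = 1e-4
fd = (sigmoid(x + eps) - 2 * y + sigmoid(x - eps)) / eps ** 2
assert abs(d2ydx2 - fd) < 1e-6
```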