[GitHub] [incubator-mxnet] apeforest commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
apeforest commented on issue #15288 — URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-507862358

@kshitij12345 Could you approve the PR if everything looks good to you now? Thanks!
[GitHub] [incubator-mxnet] apeforest commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
apeforest commented on issue #15288 — URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-507748706

@sxjscience Please help review this PR. Thanks!
[GitHub] [incubator-mxnet] apeforest commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
apeforest commented on issue #15288 — URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-507516653

I verified that the result matches PyTorch. The lambdas below encode the analytic derivatives of the sigmoid: with s = sigmoid(x), s' = s(1 - s), s'' = s'(1 - 2s), and s''' = s''(1 - 2s) - 2s'^2 = s'' - 2(s'^2 + s''*s).

```
import torch
import numpy as np

# Sigmoid and its first three analytic derivatives.
op = lambda x: torch.sigmoid(x)
grad_op = lambda x: op(x) * (1 - op(x))
grad_grad_op = lambda x: grad_op(x) * (1 - 2 * op(x))
grad_grad_grad_op = lambda x: grad_grad_op(x) - 2 * (
    grad_op(x)**2 + grad_grad_op(x) * op(x))

x = torch.tensor(np.array([1, 2, 3]), dtype=torch.float32)
head_grads = torch.tensor(np.array([1, 1, 1]), dtype=torch.float32) * 0.5
head_grad_grads = torch.tensor(np.array([1, 1, 1]), dtype=torch.float32) * 0.6
head_grad_grad_grads = torch.tensor(np.array([1, 1, 1]), dtype=torch.float32) * 0.7
x.requires_grad = True
head_grads.requires_grad = True

y = op(x)

# First-order gradient, kept in the graph so it can be differentiated again.
x_grad = torch.autograd.grad(y, x, grad_outputs=head_grads,
                             create_graph=True, retain_graph=True)[0]
expected_grad_x = (grad_op(x) * head_grads).detach().numpy()
print('expected_grad_x = {}'.format(expected_grad_x))
print('grad_x = {}'.format(x_grad.detach().numpy()))

# Second-order gradient, then one more backward pass for the third order.
x_grad_grad = torch.autograd.grad(x_grad, x, grad_outputs=head_grad_grads,
                                  create_graph=True, retain_graph=True)[0]
x_grad_grad.backward(head_grad_grad_grads)

expected_grad_grad_x = (grad_grad_op(x) * head_grads * head_grad_grads).detach().numpy()
expected_head_grad = (grad_op(x) * head_grad_grads).detach().numpy()
expected_grad_grad_grad_x = (grad_grad_grad_op(x) * head_grads * head_grad_grads
                             * head_grad_grad_grads).detach().numpy()
print('expected_grad_grad_x = {}'.format(expected_grad_grad_x))
print('grad_grad_x = {}'.format(x_grad_grad.detach().numpy()))
print('expected_grad_grad_grad_x = {}'.format(expected_grad_grad_grad_x))
print('grad_grad_grad_x = {}'.format(x.grad.detach().numpy()))
```
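For reference, the analogous second-order check on the MXNet side could look like the minimal sketch below. This assumes a build with this PR applied (it is exactly the sigmoid higher-order gradient this PR adds); mxnet.autograd.grad with create_graph=True is existing API, and the expected value uses the same analytic s'' = s(1 - s)(1 - 2s) as above.

```
from mxnet import nd, autograd

x = nd.array([1, 2, 3])
x.attach_grad()
head_grads = nd.ones_like(x) * 0.5
head_grad_grads = nd.ones_like(x) * 0.6

with autograd.record():
    y = nd.sigmoid(x)
    # First-order gradient, kept in the graph so it can be differentiated again.
    x_grad = autograd.grad(y, x, head_grads=head_grads,
                           create_graph=True, retain_graph=True)[0]
x_grad.backward(head_grad_grads)

# Analytic second derivative: sigmoid''(x) = s(1 - s)(1 - 2s) with s = sigmoid(x).
s = nd.sigmoid(x)
expected = s * (1 - s) * (1 - 2 * s) * head_grads * head_grad_grads
print('grad_grad_x =', x.grad.asnumpy())
print('expected    =', expected.asnumpy())
```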
[GitHub] [incubator-mxnet] apeforest commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
apeforest commented on issue #15288 — URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-503887509

@larroy I also added a method to dump the computation graph in imperative mode, since it will be very useful for debugging. However, it's still quite rudimentary, and we still need your help to implement a more elegant way of printing out the graph info. Thanks!
[GitHub] [incubator-mxnet] apeforest commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
apeforest commented on issue #15288 — URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-503887003

@larroy @sxjscience Please help review this PR. Thanks!
[GitHub] [incubator-mxnet] apeforest commented on issue #15288: [MXNET-978] Higher order gradient for sigmoid
apeforest commented on issue #15288 — URL: https://github.com/apache/incubator-mxnet/pull/15288#issuecomment-503886931

@kshitij12345 I have figured out how backward works when one of the inputs to the backward node is an output of the forward node. Please review this PR. Thanks!
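This situation arises for sigmoid precisely because its backward pass consumes the forward output rather than the forward input: d/dx sigmoid(x) = y(1 - y) where y = sigmoid(x). A minimal standalone sketch of that dependency (sigmoid_forward and sigmoid_backward are illustrative names, not the PR's actual kernels):

```
import numpy as np

def sigmoid_forward(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_backward(head_grad, y):
    # The backward pass depends only on the forward *output* y, not on x:
    # d/dx sigmoid(x) = y * (1 - y), so y is reused instead of recomputing.
    return head_grad * y * (1.0 - y)

x = np.array([1.0, 2.0, 3.0])
y = sigmoid_forward(x)
print(sigmoid_backward(np.ones_like(x), y))
```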