[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-19 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-503434484

@kshitij12345 There was an issue with CI recently. Could you please re-trigger it one more time? Sorry for the inconvenience.

[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-17 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-502828155

@kshitij12345 One GPU test still failed. I looked at the log and didn't find it related to your change. Could you please rebase and trigger CI one …

[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-13 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-501822329

@kshitij12345 Could you please rebase and retrigger CI? Thanks! This is an …

[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-10 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-500550200

There is no available utility to print out values from an NDArray. I used to write a for loop that iterates over the `dptr` pointer and dumps out the values.
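For context, that limitation applies to the C++ backend being discussed; on the Python side an NDArray's values can be inspected directly. A minimal sketch (the array contents are illustrative):

```python
import mxnet as mx

x = mx.nd.array([1.0, 2.0, 3.0])
print(x.asnumpy())  # copies the data into a NumPy array for easy inspection
```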

[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-10 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-500511759

> @apeforest, thank you for explaining how to get the dump of the graph. Waiting for the PR which simplifies that.
>
> Also, could you tell me …

[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-07 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-500055762

> `y_grad` may not have any useful meaning as of now; however, I believe that we should test for the value using case 1.2, just to verify and …

[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-07 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-500054793

> Also, it would be really great if you could tell me how to get the dump of the graph; that would be really helpful.

@kshitij12345 There is a …

[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-06 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-499694593

As a follow-up, I just dumped out the computation graph in case 2. Indeed, the node used to calculate y_grad.grad is not even in the final …

[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-06 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-499608736

@kshitij12345 I think it's because of the design of the backward computation graph in MXNet. In the C++ implementation, when you specify variables=x, it …
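For reference, a minimal sketch of the pattern under discussion, using the front-end `mx.autograd.grad` API (values and shapes are illustrative, not from the PR): gradients are only requested for the arrays passed as `variables`, so anything not listed there gets no gradient node in the backward graph.

```python
import mxnet as mx
from mxnet import nd, autograd

x = nd.array([1.0, 2.0, 4.0])
x.attach_grad()

with autograd.record():
    y = nd.log(x)
    # create_graph=True records the first backward pass itself,
    # so the returned x_grad can be differentiated once more.
    x_grad = autograd.grad(heads=y, variables=x,
                           create_graph=True, retain_graph=True)[0]

x_grad.backward()        # second backward pass
print(x.grad.asnumpy())  # d/dx (1/x) = -1/x^2 for y = log(x)
```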

[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-06 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-499599418

> ```python
> x_grad = x_grad_mid * y_grad  # Note this part.
> ```

I don't think this part is correct. Note: your `x_grad_mid` …

[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-05 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-499287059

@kshitij12345 The computation graph for the second backward pass makes sense to me. As you can see, there is only one output from the graph, that …

[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-05 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-499202039

@larroy Yes, I agree with your reply. Also, I don't understand the meaning of (or need for) returning dL/dy_grad. @kshitij12345 Please comment. Thanks.

[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-05 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-499192329

I did some more probing. I think the reason head_grads.grad is all zeros is that the variable head_grads was not specified during the second …
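A sketch of the idea being described, assuming the test follows MXNet's usual higher-order-gradient pattern (names and values are illustrative): unless `head_grads` has a gradient buffer attached and is listed in `variables` for the second pass, nothing is computed for it. Whether its gradient is then actually populated is exactly the behavior debated in this thread.

```python
import mxnet as mx
from mxnet import nd, autograd

x = nd.array([1.0, 2.0, 4.0])
x.attach_grad()
head_grads = nd.array([0.5, 0.5, 0.5])
head_grads.attach_grad()

with autograd.record():
    y = nd.log(x)
    x_grad = autograd.grad(heads=y, variables=x, head_grads=head_grads,
                           create_graph=True, retain_graph=True)[0]

# Second pass: list head_grads as a variable so its gradient is computed
# rather than left at zero.
dx, dhead = autograd.grad(heads=x_grad, variables=[x, head_grads])
print(dhead.asnumpy())  # d(x_grad)/d(head_grads) = 1/x for y = log(x)
```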

[GitHub] [incubator-mxnet] apeforest commented on issue #15120: [bug] fix higher grad log

2019-06-05 Thread GitBox
URL: https://github.com/apache/incubator-mxnet/pull/15120#issuecomment-499170409

@kshitij12345 I have a question about the equation `expected_head_grad = (grad_op(x) * head_grad_grads).asnumpy()` in your test. My …
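For context, a sketch of where that expectation comes from, assuming y = log(x) so that grad_op(x) = 1/x: the first backward produces x_grad = head_grads * grad_op(x), which is linear in head_grads, so differentiating x_grad with respect to head_grads and contracting with head_grad_grads yields grad_op(x) * head_grad_grads.

```python
from mxnet import nd

# Numerical illustration of the identity for y = log(x):
#   x_grad = head_grads * grad_op(x),  with grad_op(x) = 1/x
#   => d(x_grad)/d(head_grads) = grad_op(x)
#   => expected_head_grad = grad_op(x) * head_grad_grads
x = nd.array([1.0, 2.0, 4.0])
head_grad_grads = nd.array([2.0, 3.0, 5.0])

expected_head_grad = (1.0 / x * head_grad_grads).asnumpy()
print(expected_head_grad)  # [2.0, 1.5, 1.25]
```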