dcslin commented on a change in pull request #753:
URL: https://github.com/apache/singa/pull/753#discussion_r447415096
##########
File path: python/singa/autograd.py
##########
@@ -117,7 +117,11 @@ def gradients(y, dy=None):
"""
grads = {} # mapping: x->dx if x.stores_grad
for p, dp in backward(y, dy):
- grads[p] = dp
+ # TODO: this fn is only helper for test case for now.
+ # 1. could implement __hash__ or
Review comment:
This `gradients()` is currently only used in two of the operation test
cases, so the change does not affect other places.
`Opt.py` performs the backward pass itself and does not use this function.
`gradients()` is essentially "run backward, do NOT update, and return the
gradients". I did not move it directly into the test scripts because in some
cases users may want to get the gradients explicitly, like `x.grad` in PyTorch.
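
For context, here is a minimal sketch of how a test might call `gradients()` after a forward pass, without any optimizer step. The tensor and operator helpers used below (`tensor.Tensor`, `gaussian`, `set_value`, `autograd.matmul`, `autograd.mse_loss`) are assumptions about the singa Python API and are not taken from this pull request:

```python
# Hedged sketch, not code from this PR: exercise gradients() the way a
# test case might, reading back gradients without an optimizer update.
from singa import autograd, tensor, device

dev = device.get_default_device()

# Parameters that should have their gradients stored (assumed constructor
# keywords; attribute names match the x.stores_grad check in the docstring).
x = tensor.Tensor((2, 3), dev, requires_grad=True, stores_grad=True)
x.gaussian(0.0, 1.0)
w = tensor.Tensor((3, 2), dev, requires_grad=True, stores_grad=True)
w.gaussian(0.0, 1.0)

# A dummy target for the loss.
t = tensor.Tensor((2, 2), dev)
t.set_value(0.0)

y = autograd.matmul(x, w)       # forward pass records the graph
loss = autograd.mse_loss(y, t)  # assumed loss op to backprop from

# Backward only, no parameter update; returns the x -> dx mapping for
# tensors flagged with stores_grad (how the dict is keyed is exactly
# what the TODO in the diff above discusses).
grads = autograd.gradients(loss)
```

This mirrors the PyTorch pattern of calling `loss.backward()` and then inspecting `x.grad`, except the gradients are handed back in a dict instead of being attached to the tensors.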