larroy commented on a change in pull request #15109: [DOC] refine autograd docs
URL: https://github.com/apache/incubator-mxnet/pull/15109#discussion_r295007167
 
 

 ##########
 File path: docs/api/python/autograd/autograd.md
 ##########
 @@ -76,7 +82,63 @@ Detailed tutorials are available in Part 1 of
 [the MXNet gluon book](http://gluon.mxnet.io/).
 
 
+# Higher order gradient
+
+Some operators support higher order gradients, that is, calculating the gradient of the
+gradient. For this to work, the operator's backward pass must itself be differentiable. Some
+operators can be differentiated an arbitrary number of times, others twice, and most just once.
+
+For calculating higher order gradients, we can use the `mx.autograd.grad` function while recording
+and then call backward, or call `mx.autograd.grad` two times. If we do the latter, it is important
+that the first call uses `create_graph=True` and `retain_graph=True` and that the second call uses
+`create_graph=False` and `retain_graph=True`. Otherwise we will not get the results that we want.
+If we were to recreate the graph in the second call, we would end up with a graph of just the
+backward nodes, not the full initial graph that includes the forward nodes.
+
+The pattern to calculate higher order gradients is the following:
+
+```python
+from mxnet import ndarray as nd
+from mxnet import autograd as ag
+
+x = nd.array([1, 2, 3])
+x.attach_grad()
+
+def f(x):
+    # A function which supports higher order gradients
+    return nd.log(x)
+```
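Independently of autograd, the analytic derivatives of `log` can be cross-checked with central finite differences. This is a plain-Python sketch, not part of the MXNet API, and the helper names are hypothetical:

```python
import math

def first_derivative(f, x, h=1e-5):
    # Central difference: (f(x+h) - f(x-h)) / (2h) ~ f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

def second_derivative(f, x, h=1e-4):
    # Central difference: (f(x+h) - 2 f(x) + f(x-h)) / h^2 ~ f''(x)
    return (f(x + h) - 2 * f(x) + f(x - h)) / (h * h)

# For f(x) = log(x): f'(x) = 1/x and f''(x) = -1/x**2.
for x in (1.0, 2.0, 3.0):
    print(x, first_derivative(math.log, x), second_derivative(math.log, x))
```

Values like these are handy for asserting that the autograd results match the analytic derivatives.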
+
+If the operators used in `f` don't support higher order gradients, you will get an error like
+`operator ... is non-differentiable because it didn't register FGradient attribute.`. This means
+that the operator doesn't support taking the gradient of the gradient, that is, running backward
+on the backward graph.
+
+Using `mx.autograd.grad` multiple times:
+
+```python
+with ag.record():
+    y = f(x)
+    x_grad = ag.grad(y, x, create_graph=True, retain_graph=True)[0]
 
 Review comment:
   Why do you want to make it more complex with head gradients?

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
