aaronmarkham commented on a change in pull request #16500: Fixing broken links
URL: https://github.com/apache/incubator-mxnet/pull/16500#discussion_r335664581
 
 

 ##########
 File path: docs/python_docs/python/tutorials/packages/autograd/index.md
 ##########
 @@ -159,7 +159,7 @@ print('is_training:', is_training, output)
 
 We called `dropout` while `autograd` was recording this time, so our network 
was in training mode and we see dropout of the input this time. Since the 
probability of dropout was 50%, the output is automatically scaled by 1/0.5=2 
to preserve the average activation.
 
-We can force some operators to behave as they would during training, even in 
inference mode. One example is setting `mode='always'` on the 
[Dropout](https://mxnet.apache.org/api/python/ndarray/ndarray.html?highlight=dropout#mxnet.ndarray.Dropout)
 operator, but this usage is uncommon.
+We can force some operators to behave as they would during training, even in 
inference mode. One example is setting `mode='always'` on the 
[Dropout](/api/python/ndarray/ndarray.html?highlight=dropout#mxnet.ndarray.Dropout)
 operator, but this usage is uncommon.
 
 Review comment:
   ```suggestion
   We can force some operators to behave as they would during training, even in 
inference mode. One example is setting `mode='always'` on the 
[Dropout](/api/python/ndarray/ndarray.html#mxnet.ndarray.Dropout) operator, but 
this usage is uncommon.
   ```
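
   For context on the behavior the tutorial paragraph (and this link) refers to, here is a minimal sketch, not part of the patch, showing `mode='always'` on the Dropout operator; the array shape and `p=0.5` are assumed values for illustration:

   ```python
   # Minimal sketch (assumed values): forcing dropout at inference time.
   import mxnet as mx

   x = mx.nd.ones((2, 4))

   # Default mode='training': outside autograd.record(), dropout is skipped
   # and the input passes through unchanged.
   print(mx.nd.Dropout(x, p=0.5))

   # mode='always': dropout is applied even in inference mode; surviving
   # activations are scaled by 1/(1 - p) = 2 to preserve the average activation.
   print(mx.nd.Dropout(x, p=0.5, mode='always'))
   ```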

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
