indhub commented on a change in pull request #11304: Added Learning Rate Finder 
tutorial
URL: https://github.com/apache/incubator-mxnet/pull/11304#discussion_r197523122
 
 

 ##########
 File path: docs/tutorials/gluon/learning_rate_finder.md
 ##########
 @@ -0,0 +1,321 @@
+
+# Learning Rate Finder
+
+Setting the learning rate for stochastic gradient descent (SGD) is crucially 
important when training a neural network because it controls both the speed of 
convergence and the ultimate performance of the network. Set the learning rate 
too low and you could be twiddling your thumbs for quite some time as the 
parameters update very slowly. Set it too high and the updates will skip over 
optimal solutions, or worse, the optimizer might not converge at all!
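+
+The too-low/too-high trade-off above can be seen on even the simplest loss. The sketch below (not from the tutorial; a toy illustration) runs gradient descent on f(w) = w**2, where each update multiplies w by (1 - 2*lr), so the iterates shrink toward the minimum when lr is small and grow without bound once lr exceeds 1.0:
+
+```python
+def run_sgd(lr, steps=20, w=5.0):
+    """Plain gradient descent on f(w) = w**2 (minimum at w = 0, gradient 2w)."""
+    for _ in range(steps):
+        w -= lr * 2.0 * w   # w <- w - lr * f'(w); equivalent to w *= (1 - 2*lr)
+    return abs(w)
+
+small = run_sgd(0.1)   # |1 - 2*lr| = 0.8 < 1, so w decays toward the minimum
+large = run_sgd(1.1)   # |1 - 2*lr| = 1.2 > 1, so w overshoots further each step
+```
+
+Real losses are not quadratic, but the same qualitative behaviour (slow crawl, fast convergence, divergence) appears as the learning rate grows, which is exactly what a learning rate finder exploits.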
+
+Leslie Smith from the U.S. Naval Research Laboratory presented a method for 
finding a good learning rate in a paper called ["Cyclical Learning Rates for 
Training Neural Networks"](https://arxiv.org/abs/1506.01186). We take a look at 
the central idea of the paper, cyclical learning rate schedules, in the 
tutorial found here, but in this tutorial we implement a 'Learning Rate Finder' 
in MXNet with the Gluon API that you can use while training your own networks.
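+
+Before the Gluon implementation, the core sweep can be sketched framework-agnostically: train while growing the learning rate exponentially and record the loss after each update; a good rate lies where the loss falls fastest, shortly before it blows up. This is a minimal illustration of Smith's idea, not the tutorial's code; `step_fn` and `loss_fn` are hypothetical stand-ins for one optimizer update and a loss read-out:
+
+```python
+def lr_sweep(step_fn, loss_fn, lr_start=1e-5, lr_end=1.0, num_iters=100):
+    """Grow the learning rate exponentially from lr_start to lr_end over
+    num_iters updates, recording the loss after each one."""
+    factor = (lr_end / lr_start) ** (1.0 / (num_iters - 1))
+    lrs, losses = [], []
+    lr = lr_start
+    for _ in range(num_iters):
+        step_fn(lr)              # one optimizer update at this learning rate
+        lrs.append(lr)
+        losses.append(loss_fn())
+        lr *= factor
+    return lrs, losses
+
+# Toy usage: SGD on f(w) = (w - 3)**2 as a stand-in for a training loop.
+state = {"w": 5.0}
+
+def sgd_step(lr):
+    state["w"] -= lr * 2.0 * (state["w"] - 3.0)  # w <- w - lr * f'(w)
+
+def current_loss():
+    return (state["w"] - 3.0) ** 2
+
+lrs, losses = lr_sweep(sgd_step, current_loss,
+                       lr_start=1e-4, lr_end=1.0, num_iters=50)
+```
+
+In the Gluon version the same loop would wrap real forward/backward passes, with the recorded (lr, loss) pairs typically smoothed and plotted to pick the rate just before the loss curve turns upward.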
 
 Review comment:
   "but in this tutorial" - "but in this tutorial,"

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
