hetong007 commented on a change in pull request #11374: [MXNET-563] Refactor R optimizers to fix memory leak
URL: https://github.com/apache/incubator-mxnet/pull/11374#discussion_r202840846
 
 

 ##########
 File path: R-package/R/optimizer.R
 ##########
 @@ -159,41 +209,60 @@ mx.opt.rmsprop <- function(learning.rate=0.002,
 #'      L2 regularization coefficient add to all the weights.
 #' @param rescale.grad float, default=1.0
 #'      rescaling factor of gradient.
-#' @param clip_gradient float, optional
+#' @param clip_gradient float, optional, default=-1
 #'      clip gradient in range [-clip_gradient, clip_gradient].
 #' @param lr_scheduler function, optional
 #'      The learning rate scheduler.
 #'
-mx.opt.adam <- function(learning.rate=0.001,
-                        beta1=0.9,
-                        beta2=0.999,
-                        epsilon=1e-8,
-                        wd=0,
-                        rescale.grad=1,
-                        clip_gradient = NULL,
+mx.opt.adam <- function(learning.rate = 1e-3,
+                        beta1 = 0.9,
+                        beta2 = 0.999,
+                        epsilon = 1e-8,
+                        wd = 0,
+                        rescale.grad = 1,
+                        clip_gradient = -1,
 
 Review comment:
   What is the difference between setting it to `1` and `-1`?
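
   For reference, a minimal sketch (not the PR's code) of the usual MXNet convention, assuming clipping is applied only when `clip_gradient > 0`, so a default of `-1` (or any non-positive value) means "no clipping". The helper name `clip.grad` is hypothetical and only for illustration:

   ```r
   # Illustrative only: assumes the MXNet convention that a gradient is
   # clipped element-wise to [-clip_gradient, clip_gradient] when
   # clip_gradient > 0, and left untouched otherwise (e.g. when it is -1).
   clip.grad <- function(grad, clip_gradient = -1) {
     if (clip_gradient > 0) {
       grad <- pmax(pmin(grad, clip_gradient), -clip_gradient)
     }
     grad
   }

   clip.grad(c(-5, 0.5, 5), clip_gradient = 1)   # clipped to [-1, 1]
   clip.grad(c(-5, 0.5, 5), clip_gradient = -1)  # returned unchanged
   ```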
