[ https://issues.apache.org/jira/browse/SPARK-6867?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14506361#comment-14506361 ]
Rakesh Chalasani commented on SPARK-6867:
-----------------------------------------

The PR submitted before has been made into a spark-package and is available here:
http://spark-packages.org/package/rakeshchalasani/MLlib-dropout

> Dropout regularization
> ----------------------
>
>                 Key: SPARK-6867
>                 URL: https://issues.apache.org/jira/browse/SPARK-6867
>             Project: Spark
>          Issue Type: New Feature
>          Components: MLlib
>            Reporter: Rakesh Chalasani
>            Priority: Minor
>
> Linear models in MLlib so far support only no regularization, L1, and L2. Another,
> more recently popularized, regularization method is dropout
> [http://www.cs.toronto.edu/~rsalakhu/papers/srivastava14a.pdf]. Dropout
> regularization randomly omits some of the input features at each
> iteration.
> Though this approach is mostly used in training deep networks, it can
> also be very useful for linear models, as it promotes adaptive
> regularization. This approach is particularly useful in NLP
> [http://papers.nips.cc/paper/4882-dropout-training-as-adaptive-regularization.pdf]
> and, because of its simplicity, can easily be adopted for streaming linear
> models as well.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
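To make the proposal concrete, here is a minimal sketch of dropout as a regularizer for a plain linear model, written in standalone numpy rather than against the MLlib API (the function name, hyperparameters, and SGD loop are illustrative assumptions, not the spark-package's actual implementation). Each gradient step randomly zeroes a subset of input features, with "inverted" scaling so the expected input is unchanged:

```python
import numpy as np

def sgd_dropout_linear(X, y, p_keep=0.8, lr=0.05, epochs=30, seed=0):
    """Least-squares linear model trained by per-sample SGD, where each
    input feature is randomly omitted (dropout) on every update.

    Illustrative sketch only -- not the MLlib-dropout package's API.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Keep each feature with probability p_keep; scale the
            # survivors by 1/p_keep so E[masked x] equals x.
            mask = (rng.random(d) < p_keep) / p_keep
            xi = X[i] * mask
            err = xi @ w - y[i]          # squared-loss residual
            w -= lr * err * xi           # SGD step on the masked input
    return w
```

Note that even with inverted scaling, the mask's variance penalizes large weights on high-magnitude features, which is exactly the adaptive (feature-dependent) L2-like effect described in the Wager et al. paper linked above.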