[ https://issues.apache.org/jira/browse/SPARK-2995?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14094973#comment-14094973 ]
Apache Spark commented on SPARK-2995:
-------------------------------------

User 'mengxr' has created a pull request for this issue:
https://github.com/apache/spark/pull/1913

> Allow to set storage level for intermediate RDDs in ALS
> -------------------------------------------------------
>
>                 Key: SPARK-2995
>                 URL: https://issues.apache.org/jira/browse/SPARK-2995
>             Project: Spark
>          Issue Type: New Feature
>          Components: MLlib
>            Reporter: Xiangrui Meng
>            Assignee: Xiangrui Meng
>
> As mentioned in [SPARK-2465], using MEMORY_AND_DISK_SER together with
> spark.rdd.compress=true can help reduce the space requirement by a lot, at
> the cost of speed. It might be useful to add this option so people can run
> ALS on much bigger datasets.

--
This message was sent by Atlassian JIRA
(v6.2#6252)
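For illustration, a minimal sketch of what using this feature could look like, assuming the linked PR exposes a setter such as `setIntermediateRDDStorageLevel` on `mllib.recommendation.ALS` (the exact method name is an assumption based on the issue title, not confirmed by this message); the input path `ratings.csv` and the hyperparameter values are hypothetical:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.recommendation.{ALS, Rating}
import org.apache.spark.rdd.RDD
import org.apache.spark.storage.StorageLevel

// Compress serialized RDD partitions, trading CPU time for space,
// as suggested in the issue description (spark.rdd.compress=true).
val conf = new SparkConf()
  .setAppName("ALSStorageLevelExample")
  .set("spark.rdd.compress", "true")
val sc = new SparkContext(conf)

// Hypothetical input file: one "user,product,rating" triple per line.
val ratings: RDD[Rating] = sc.textFile("ratings.csv").map { line =>
  val Array(user, product, rating) = line.split(',')
  Rating(user.toInt, product.toInt, rating.toDouble)
}

// Keep ALS's intermediate RDDs serialized, spilling to disk when memory
// runs out (MEMORY_AND_DISK_SER), so larger datasets fit at some speed cost.
val model = new ALS()
  .setRank(10)
  .setIterations(10)
  .setIntermediateRDDStorageLevel(StorageLevel.MEMORY_AND_DISK_SER)
  .run(ratings)
```

The combination of serialized storage plus `spark.rdd.compress` is the space-saving configuration the description refers to; without the new setter, ALS's internal persistence level could not be changed by the caller.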