Spark ClosureCleaner or java serializer OOM when trying to grow

2015-09-23 Thread jluan
[…] about 3000 or so are non-zeros. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-ClosureCleaner-or-java-serializer-OOM-when-trying-to-grow-tp24796.html Sent from the Apache Spark User List mailing list archive.

Re: Spark ClosureCleaner or java serializer OOM when trying to grow

2015-09-24 Thread Ted Yu
[…]
> spark.storage.memoryFraction 0.9
> spark.storage.shuffleFraction 0.05
> spark.default.parallelism 128
>
> The master machine has approximately 240 GB of RAM and each worker has
> about 120 GB of RAM.
>
> I load in a relatively tiny RDD of MLlib LabeledPoint objects, with each […]
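The settings quoted above can be applied through a `SparkConf` before the context is created. A minimal sketch follows; note that `spark.storage.shuffleFraction` is not a standard Spark 1.x property (the usual shuffle setting of that era was `spark.shuffle.memoryFraction`, so the quoted name may be a typo in the original post), and the app name here is a placeholder:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch of applying the memory settings quoted in the thread (Spark 1.x era).
val conf = new SparkConf()
  .setAppName("MemoryTuningExample")                // placeholder name
  .set("spark.storage.memoryFraction", "0.9")       // heap fraction for cached RDDs
  .set("spark.shuffle.memoryFraction", "0.05")      // heap fraction for shuffle buffers
  .set("spark.default.parallelism", "128")          // default number of partitions
val sc = new SparkContext(conf)
```

Setting `spark.storage.memoryFraction` to 0.9 leaves very little headroom for execution and user objects, which is one plausible contributor to the OOM described in the thread.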

Re: Spark ClosureCleaner or java serializer OOM when trying to grow

2015-09-24 Thread jluan
[…] categoricalFeaturesInfo, numTrees, featureSubsetStrategy, impurity, maxDepth, maxBins) -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-ClosureCleaner-or-java-serializer-OOM-when-trying-to-grow-tp24796p24818.html
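The truncated call above matches MLlib's `RandomForest.trainClassifier` signature. A hedged sketch with the parameters named in the message — the training RDD and all concrete values here are illustrative placeholders, not the poster's actual settings:

```scala
import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.tree.model.RandomForestModel
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.rdd.RDD

// Sketch: assumes `data: RDD[LabeledPoint]` is already loaded, as in the thread.
def train(data: RDD[LabeledPoint]): RandomForestModel = {
  val numClasses = 2
  val categoricalFeaturesInfo = Map[Int, Int]()  // empty map = all features continuous
  val numTrees = 100
  val featureSubsetStrategy = "auto"             // let MLlib choose based on numTrees
  val impurity = "gini"
  val maxDepth = 5
  val maxBins = 32
  RandomForest.trainClassifier(data, numClasses, categoricalFeaturesInfo,
    numTrees, featureSubsetStrategy, impurity, maxDepth, maxBins)
}
```

Large values of `maxBins` or a wide, sparse feature vector inflate the per-task closure and serialized tree state, which is the kind of growth the ClosureCleaner/serializer OOM in this thread points at.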

Re: Spark ClosureCleaner or java serializer OOM when trying to grow

2015-11-14 Thread rohangpatil
[…]--class "Word2VecApp" --master local[30] target/scala-2.10/word2vec-project_2.10-1.0.jar -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-ClosureCleaner-or-java-serializer-OOM-when-trying-to-grow-tp24796p25383.html