Btw, the node only has 4GB of memory, so does the spark.executor.memory setting make
sense?
Should I instead make it around 2-3GB? Also, how different is this parameter
from SPARK_MEM?
Thanks,
Saurabh
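
For reference, a rough sketch of how the two settings differ in Spark 0.8 (my
understanding, worth double-checking against the docs): SPARK_MEM is an
environment variable read by the launch scripts that sizes the JVMs
cluster-wide, whereas spark.executor.memory is a per-application system
property that caps each executor's heap. The 2g value below is only an
illustration for a 4GB node, not a recommendation from this thread:

// Spark 0.8 memory knobs, sketched; 2g is an assumption for a 4GB node.
// Cluster-wide default, set via the launch scripts (in conf/spark-env.sh):
//   export SPARK_MEM=2g
// Per-application override, set in the driver before the SparkContext exists:
System.setProperty("spark.executor.memory", "2g") // leaves headroom for the OS and daemons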
On Fri, Dec 6, 2013 at 8:26 AM, learner1014 all wrote:
Still see a whole lot of the following errors:
java.lang.OutOfMemoryError: Java heap space
13/12/05 16:04:13 INFO executor.StandaloneExecutorBackend: Got assigned task 553
13/12/05 16:04:13 INFO executor.Executor: Running task ID 553
Issue seems to be that the process hangs as we are probably performing
Try allocating some more resources to your application.
You seem to be using 512MB for your worker node (you can verify that from
the master UI).
Try putting the following setting into your code and see if it helps:
System.setProperty("spark.executor.memory", "15g") // Will allocate more memory
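
One caveat (my reading of how 0.8 picks up properties, so treat it as an
assumption): the property only takes effect if it is set before the
SparkContext is constructed. A minimal driver skeleton, where the master URL
and app name are placeholders:

import org.apache.spark.SparkContext

object JoinJob {
  def main(args: Array[String]) {
    // Must run before the SparkContext is created, or it is ignored.
    System.setProperty("spark.executor.memory", "15g")
    val sc = new SparkContext("spark://master:7077", "JoinJob")
    // ... job logic ...
    sc.stop()
  }
}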
Hi,
Trying to do a join operation on RDDs; my input is pipe-delimited data in
2 files.
One file is 24MB and the other file is 285MB.
The setup being used is a single-node (server) setup, with SPARK_MEM set to 512m.
Master
/pkg/java/jdk1.7.0_11/bin/java -cp
:/spark-0.8.0-incubating-bin-cdh4/c
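
For concreteness, a sketch of the kind of join described above; the file names
and field layout are assumptions, sc is an existing SparkContext, and on 0.8
the pair-RDD join needs the SparkContext._ implicits:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._ // implicits that add join() to pair RDDs

// Parse each pipe-delimited file into (key, value) pairs keyed on the first field.
// split('|') takes a Char, avoiding the regex meaning '|' would have in split("|").
val small = sc.textFile("file1.txt").map { line =>
  val f = line.split('|'); (f(0), f(1))
}
val large = sc.textFile("file2.txt").map { line =>
  val f = line.split('|'); (f(0), f(1))
}
val joined = small.join(large) // shuffles both inputs by key; the memory-hungry step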