java.lang.OutOfMemoryError: Java heap space when running job via spark-submit

2014-10-09 Thread Jaonary Rabarisoa
Dear all, I have a Spark job with the following configuration:

    val conf = new SparkConf()
      .setAppName("My Job")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.kryo.registrator", "value.serializer.Registrator")
      .setMaster("local[4]")
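
[Editor's note: for context, a minimal sketch of what a Kryo registrator such as the value.serializer.Registrator referenced above typically looks like; the class body and registered type here are assumptions, not taken from the thread.]

    import com.esotericsoftware.kryo.Kryo
    import org.apache.spark.serializer.KryoRegistrator

    // Hypothetical registrator; the actual value.serializer.Registrator is not shown in the thread.
    class Registrator extends KryoRegistrator {
      override def registerClasses(kryo: Kryo): Unit = {
        // Register the application's value classes with Kryo; example registration only.
        kryo.register(classOf[Array[Byte]])
      }
    }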

Re: java.lang.OutOfMemoryError: Java heap space when running job via spark-submit

2014-10-09 Thread Jaonary Rabarisoa
In fact, with --driver-memory 2G I can get it working.

On Thu, Oct 9, 2014 at 6:20 PM, Xiangrui Meng men...@gmail.com wrote:
> Please use --driver-memory 2g instead of --conf spark.driver.memory=2g.
> I'm not sure whether this is a bug. -Xiangrui
>
> On Thu, Oct 9, 2014 at 9:00 AM, Jaonary Rabarisoa
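
[Editor's note: for readers hitting the same error, a sketch of the two invocations being compared. The jar name and main class are placeholders, not from the thread; only the --driver-memory vs --conf spark.driver.memory distinction comes from the messages above.]

    # Reported to work: --driver-memory is read by the launcher, so the
    # driver JVM is started with the larger heap.
    spark-submit --class my.package.MyJob --master local[4] \
      --driver-memory 2g my-job.jar

    # Reported not to work in this thread: setting spark.driver.memory via
    # --conf (or in SparkConf) after the driver JVM has already started.
    spark-submit --class my.package.MyJob --master local[4] \
      --conf spark.driver.memory=2g my-job.jar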