Hi all, I am getting the following error in the interactive spark-shell (Spark 0.8.1):


org.apache.spark.SparkException: Job aborted: Task 0.0:0 failed more than 0 times; aborting job
java.lang.OutOfMemoryError: GC overhead limit exceeded


But I have already set the following in spark-env.sh and hadoop-env.sh:

export SPARK_DAEMON_MEMORY=8g
export SPARK_WORKER_MEMORY=8g
export SPARK_DAEMON_JAVA_OPTS="-Xms8g -Xmx8g"
export SPARK_JAVA_OPTS="-Xms8g -Xmx8g"


export HADOOP_HEAPSIZE=4000
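
In case it matters: I have not set SPARK_MEM, which (if I read the 0.8.x docs correctly) controls the heap of the shell/executor JVMs themselves rather than the daemons. Would something like the following in conf/spark-env.sh be needed as well? The 8g value below is just my guess, not something I have tested:

export SPARK_MEM=8g   # per-JVM memory for the shell/executors, if I understand the docs correctly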

Any suggestions?

-- 
Sai Prasanna. AN
II M.Tech (CS), SSSIHL
