PS you have a typo in "DEAMON" - it's DAEMON. Thanks, Latin.
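Since the misspelled variables are simply never read, neither SPARK_DEAMON_MEMORY nor SPARK_DEAMON_JAVA_OPTS ever took effect - and even spelled correctly, SPARK_DAEMON_MEMORY only sizes the standalone master/worker daemon JVMs, not the shell or the executors. A minimal sketch of what conf/spark-env.sh might look like on 0.8.x instead (assuming standalone mode; the 8g values are illustrative, not tuned for your machine):

export SPARK_MEM=8g             # per-node heap used by executors and the spark-shell driver in 0.8.x
export SPARK_WORKER_MEMORY=8g   # total memory a worker may hand out to its executors
export SPARK_DAEMON_MEMORY=1g   # master/worker daemon JVMs only; these rarely need much

I would also drop the -Xms8g/-Xmx8g from SPARK_JAVA_OPTS: heap flags there can conflict with the ones Spark derives from SPARK_MEM, and letting SPARK_MEM drive the heap is the conventional route on 0.8.x.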
On Mar 24, 2014 7:25 AM, "Sai Prasanna" <ansaiprasa...@gmail.com> wrote:

> Hi all! I am getting the following error in the interactive spark-shell
> [0.8.1]:
>
>
> org.apache.spark.SparkException: Job aborted: Task 0.0:0 failed more
> than 0 times; aborting job
> java.lang.OutOfMemoryError: GC overhead limit exceeded
>
>
> But I had set the following in spark-env.sh and hadoop-env.sh:
>
> export SPARK_DEAMON_MEMORY=8g
> export SPARK_WORKER_MEMORY=8g
> export SPARK_DEAMON_JAVA_OPTS="-Xms8g -Xmx8g"
> export SPARK_JAVA_OPTS="-Xms8g -Xmx8g"
>
>
> export HADOOP_HEAPSIZE=4000
>
> Any suggestions?
>
> --
> Sai Prasanna. AN
> II M.Tech (CS), SSSIHL
>
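For what it's worth, one way to confirm the shell's JVM actually got the bigger heap, using standard JDK tools (this assumes the 0.8.x layout where spark-shell sits at the top of the distribution; <pid> is a placeholder):

SPARK_MEM=8g ./spark-shell       # launch the repl with an explicit heap
# then, from another terminal:
jps -l                           # list running JVMs and find the repl's pid
jinfo -flag MaxHeapSize <pid>    # print the max heap the JVM actually received

If MaxHeapSize still reports the default, the exports in spark-env.sh are not being picked up at all.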
