--
From: moham...@glassbeam.com
To: eyc...@hotmail.com; user@spark.apache.org
Subject: RE: unknown issue in submitting a spark job
Date: Fri, 30 Jan 2015 00:32:57 +
Hi,

There are 2 ways to resolve the issue:

1. Increasing the heap size, via -Xmx1024m (or more), or
2. Disabling the error check altogether, via -XX:-UseGCOverheadLimit,

as per
http://stackoverflow.com/questions/5839359/java-lang-outofmemoryerror-gc-overhead-limit-exceeded

you can pass the java
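The sentence above is cut off, but presumably refers to passing these JVM flags through to Spark. A minimal sketch of one way to do that, using the real `spark.executor.extraJavaOptions` configuration property (the application class, master URL, and jar name below are hypothetical placeholders, not from this thread):

```shell
# Sketch: forward the GC flag from option 2 above to Spark's executors.
# spark.executor.extraJavaOptions is a documented Spark config property;
# com.example.MyJob, the master URL, and myjob.jar are placeholders.
spark-submit \
  --class com.example.MyJob \
  --master spark://master:7077 \
  --conf "spark.executor.extraJavaOptions=-XX:-UseGCOverheadLimit" \
  myjob.jar
```

Note that the heap size itself (option 1 above) cannot be set via -Xmx in extraJavaOptions; Spark disallows that and expects the executor heap to be set through its own memory settings instead.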
-----Original Message-----
From:

How much memory are you assigning to the Spark executor on the worker node?

Mohammed
From: ey-chih chow [mailto:eyc...@hotmail.com]
Sent: Thursday, January 29, 2015 3:35 PM
To: Mohammed Guller; user@spark.apache.org
Subject: RE: unknown issue in submitting a spark job

The worker node has 15G.
Subject: RE: unknown issue in submitting a spark job
Date: Thu, 29 Jan 2015 21:16:13 +

Looks like the application is using a lot more memory than available. Could be a bug somewhere in the code or just an underpowered machine. Hard to say without looking at the code.

Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded

Mohammed
I use the default value, which I think is 512MB. If I change it to 1024MB, spark-submit will fail due to not enough memory for the RDD.

Ey-Chih Chow
From: moham...@glassbeam.com
To: eyc...@hotmail.com; user@spark.apache.org
Subject: RE: unknown issue in submitting a spark job
Date: Fri, 30 Jan 2015 00:32:57 +
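The executor memory setting discussed in this exchange can be raised explicitly at submit time. A minimal sketch using spark-submit's real --executor-memory flag (the application class, master URL, and jar name are hypothetical placeholders):

```shell
# Sketch: raise the executor heap from the 512MB default mentioned above.
# --executor-memory is a documented spark-submit option;
# com.example.MyJob, the master URL, and myjob.jar are placeholders.
spark-submit \
  --class com.example.MyJob \
  --master spark://master:7077 \
  --executor-memory 1g \
  myjob.jar
```

Whether 1g fits depends on how much of the worker's 15G is already claimed by the OS, the Spark worker daemon, and other executors, which may explain the failure reported above when raising it.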