.nabble.com/unknown-issue-in-submitting-a-spark-job-tp21418.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h
From: ey-chih chow [mailto:eyc...@hotmail.com]
Sent: Thursday, January 29, 2015 1:06 AM
To: user@spark.apache.org
Subject: unknown issue in submitting a spark job
Hi,
I submitted a job using spark-submit and got the following exception.
Does anybody know how to fix this? Thanks.
Ey-Chih Chow
How much memory are you assigning to the Spark executor on the worker node?
Mohammed
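For context (not part of the original thread): executor memory is usually set either with the `--executor-memory` flag of `spark-submit` or via the `spark.executor.memory` property. A minimal sketch, in which the class name, master URL, and jar name are placeholders:

```shell
# Raise executor memory from the 512m default; all names below are examples.
spark-submit \
  --class com.example.MyJob \
  --master spark://master-host:7077 \
  --executor-memory 2g \
  --driver-memory 1g \
  my-job.jar
```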
From: ey-chih chow [mailto:eyc...@hotmail.com]
Sent: Thursday, January 29, 2015 3:35 PM
To: Mohammed Guller; user@spark.apache.org
Subject: RE: unknown issue in submitting a spark job
The worker node has 15G of memory.
Subject: RE: unknown issue in submitting a spark job
Date: Thu, 29 Jan 2015 21:16:13 +
Looks like the application is using a lot more memory than is available. Could
be a bug somewhere in the code, or just an underpowered machine. Hard to say
without looking at the code.
Caused
I use the default value, which I think is 512MB. If I change it to 1024MB,
spark-submit will fail due to not enough memory for the RDD.
Ey-Chih Chow
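A rough way to see why 512MB runs out: in Spark 1.x, only a fraction of executor memory is usable for cached RDDs, governed by `spark.storage.memoryFraction` (default 0.6) and `spark.storage.safetyFraction` (default 0.9). A back-of-the-envelope sketch, assuming those documented 1.x defaults (check your version):

```shell
# Estimate the RDD storage budget inside a Spark 1.x executor heap.
# Assumed defaults: spark.storage.memoryFraction=0.6, safetyFraction=0.9.
executor_mb=512
storage_mb=$(awk -v m="$executor_mb" 'BEGIN { printf "%.0f", m * 0.6 * 0.9 }')
echo "RDD storage budget: ${storage_mb} MB"   # roughly 276 MB of the 512 MB heap
```

So of a 512MB executor, only about 276MB is budgeted for RDD storage, before any per-partition and shuffle overhead.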
From: moham...@glassbeam.com
To: eyc...@hotmail.com; user@spark.apache.org
Subject: RE: unknown issue in submitting a spark job
Date: Fri, 30 Jan 2015 00:32:57 +