Are you getting OutOfMemory on the driver or on the executor? A typical cause
of OOM in Spark is having too few tasks for a job, so each task must hold more
data in memory than the JVM heap can handle.
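To see why too few tasks matters, here is a minimal back-of-the-envelope sketch (the function name and numbers are illustrative, not from this thread): with a fixed dataset size, the amount of data each task must hold drops as the task count rises, which is why increasing parallelism (e.g. repartitioning) can relieve OOM pressure.

```python
# Illustrative sketch: per-task memory pressure vs. number of tasks.
# Names and sizes are hypothetical, chosen only to show the arithmetic.
GIB = 1024 ** 3

def per_task_bytes(dataset_bytes: int, num_tasks: int) -> float:
    """Approximate bytes a single task must process at once."""
    return dataset_bytes / num_tasks

# An 8 GiB shuffle split across 8 tasks needs ~1 GiB per task,
# but split across 200 tasks only ~41 MiB per task.
few_tasks = per_task_bytes(8 * GIB, 8)
many_tasks = per_task_bytes(8 * GIB, 200)
```

This is only the rough intuition; actual task memory also depends on serialization, shuffle buffers, and data skew.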
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-submit-OutOfMemory-Error-in-local-mode-tp29081p29117.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Increase the number of cores allocated to the job, since you're trying to run multiple threads.
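In local mode the master URL controls how many worker threads Spark uses, and everything runs inside a single JVM, so the driver's heap is the one that matters. A sketch of such an invocation (the script name and sizes are placeholders, not from the thread):

```shell
# Run locally with 4 worker threads instead of a single thread.
# In local mode executors live inside the driver JVM, so raise driver memory.
spark-submit \
  --master "local[4]" \
  --driver-memory 4g \
  your_job.py    # placeholder application name
```

`local[*]` can be used instead of `local[4]` to use as many threads as there are logical cores on the machine.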
Sent from Naga's iPad
> On Aug 22, 2017, at 3:26 PM, "u...@moosheimer.com" wrote:
Since you didn't post any concrete information, it's hard to give you advice.
Try increasing the executor memory (spark.executor.memory).
If that doesn't help, give all the experts in the community a chance to help you
by adding more details such as the Spark version, log files, source code, etc.
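For reference, `spark.executor.memory` can be set in `spark-defaults.conf` or passed on the command line; the values below are illustrative, not a recommendation for this particular job. Note that in local mode the executors share the driver JVM, so `spark.driver.memory` is the setting that actually governs the heap there.

```
# spark-defaults.conf fragment (values are illustrative)
spark.executor.memory   4g
spark.driver.memory     4g
```

The same settings can be given per job via `spark-submit --conf spark.executor.memory=4g` or the `--executor-memory` / `--driver-memory` flags.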
Any help here will be appreciated.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-submit-OutOfMemory-Error-in-local-mode-tp29081p29096.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.