Are you getting OutOfMemory on the driver or on the executor? A typical cause
of OOM in Spark is too few tasks (partitions) for a job, so each task has to
hold more data in memory than the JVM can accommodate.
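As a rough sketch (the script name `my_job.py` and the memory/partition values are illustrative, not from this thread — tune them to your data size and machine), raising the partition count and the driver memory for a local-mode submit might look like:

```shell
# Sketch only: in local mode the driver and executors share one JVM,
# so --driver-memory is the setting that matters most.
spark-submit \
  --master "local[4]" \
  --driver-memory 4g \
  --conf spark.default.parallelism=200 \
  --conf spark.sql.shuffle.partitions=200 \
  my_job.py
```

More partitions mean smaller per-task working sets, which is often enough to avoid the OOM without raising memory at all.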
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-submit-OutOfMemory-Error-in-local-mode-tp29081p29117.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
> On 22.08.2017 at 20:16, shitijkuls <kulshreshth...@gmail.com> wrote:
>>
>> Any help here will be appreciated.
Kind regards / best regards
Kay-Uwe Moosheimer
Any help here will be appreciated.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-submit-OutOfMemory-Error-in-local-mode-tp29081p29096.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.