Hi Spark Users,
I am running some Spark jobs every hour. After running for 12 hours, the
master gets killed with the exception
*java.lang.OutOfMemoryError: GC overhead limit exceeded*
It looks like there is a memory issue in the Spark master.
The Spark master going down is a blocker. Anyone
Depends on the data volume that you are operating on.
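As a very rough rule of thumb (a heuristic only -- every number below is
made up for illustration, not taken from your job): size the partitions so
each holds on the order of 128 MB, and keep at least 2-3 tasks per core so
the cluster stays busy. Something like:

    // Rough sizing heuristic; inputBytes and totalCores are hypothetical.
    val inputBytes = 50L * 1024 * 1024 * 1024                // say ~50 GB of input
    val totalCores = 40                                      // total executor cores
    val byVolume = (inputBytes / (128L * 1024 * 1024)).toInt // ~128 MB per partition
    val byCores  = totalCores * 3                            // 2-3 tasks per core
    val numPartitions = math.max(byVolume, byCores)          // take the larger

You can then pass numPartitions to repartition() or to the shuffle
operations (reduceByKey, join, etc.) that accept a partition count.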
Thanks
Best Regards
On Mon, Sep 28, 2015 at 5:12 PM, Saurav Sinha wrote:
> Hi Akhil,
>
> My job creates 47 stages in one cycle and runs every hour.
> Can you please suggest what is the optimum number of
Hi Akhil,
Can you please explain to me how increasing the number of partitions (which
is a worker-side setting) will help?
The issue is that my master is getting OOM.
Thanks,
Saurav Sinha
On Mon, Sep 28, 2015 at 2:32 PM, Akhil Das wrote:
This behavior depends entirely on the job that you are running. Usually,
increasing the number of partitions will sort this issue out. It would help
if you could paste a code snippet or explain what type of operations you
are doing.
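For illustration only (we have not seen your code, so the app name, input
path, and the value 120 below are placeholders): you can raise the default
shuffle parallelism, or repartition an RDD explicitly:

    import org.apache.spark.{SparkConf, SparkContext}

    // A minimal sketch, assuming a standalone Scala job.
    val conf = new SparkConf()
      .setAppName("hourly-job")                  // hypothetical app name
      .set("spark.default.parallelism", "120")   // default partitions for shuffles
    val sc = new SparkContext(conf)

    val input = sc.textFile("hdfs:///input/path") // hypothetical input path
    val spread = input.repartition(120)           // more, smaller tasks per stage

Separately, if it really is the standalone master process that is running
out of heap (it keeps state for every finished application, and an hourly
job accumulates them), the usual knobs are the daemon heap size and the
retained-application limits, e.g. in conf/spark-env.sh on the master node
(values illustrative):

    export SPARK_DAEMON_MEMORY=2g
    export SPARK_MASTER_OPTS="-Dspark.deploy.retainedApplications=50 -Dspark.deploy.retainedDrivers=50"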
Thanks
Best Regards
On Mon, Sep 28, 2015 at 11:37 AM, Saurav Sinha wrote: