Does anybody have insight on this? Thanks.
On Fri, Jan 9, 2015 at 6:30 PM, Pala M Muthaia <mchett...@rocketfuelinc.com> wrote:
Hi,
I am using Spark 1.0.1. I am trying to debug an OOM exception I saw during
a join step.
Basically, I have an RDD of rows that I am joining with another RDD of
tuples.
Hey Pala,
I also find it very hard to get to the bottom of memory issues such as this
one based on what's in the logs (so if you come up with any findings, please
share them here). In the interim, here are a few things you can try:
- Provision more memory per executor. While in theory (and
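To make the first suggestion concrete, here is a minimal sketch of raising per-executor memory at submit time (the class name, jar name, master, and 4g value are illustrative assumptions, not from the thread; in Spark 1.0.x the default executor memory is 512m):

```shell
# Sketch: give each executor more heap so the shuffle buffers used by the
# join have more room. Flag names are from Spark 1.x spark-submit.
spark-submit \
  --class com.example.MyJoinJob \
  --master yarn \
  --executor-memory 4g \
  my-job-assembly.jar
```

The same setting can also be supplied programmatically via `SparkConf.set("spark.executor.memory", "4g")` before the context is created.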
Hi,
I am using Spark 1.0.1. I am trying to debug an OOM exception I saw during a
join step.
Basically, I have an RDD of rows that I am joining with another RDD of
tuples.
Some of the tasks succeed, but a fair number failed with an OOM exception with
the stack below. The stack belongs to the 'reducer'
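For context, the failing join presumably has the shape sketched below (all names, key types, and the partition count are hypothetical; this is a minimal Spark 1.0.x-style sketch, not the original job):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

// Hypothetical sketch: an RDD of rows keyed by id, joined against an
// RDD of tuples with the same key type.
object JoinSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("join-sketch"))

    val rows: RDD[(Long, String)] =
      sc.parallelize(Seq((1L, "row-a"), (2L, "row-b")))
    val tuples: RDD[(Long, Int)] =
      sc.parallelize(Seq((1L, 10), (3L, 30)))

    // join shuffles both sides by key; a hot key or too few partitions
    // can concentrate a large fraction of the data on a single reduce
    // task, which is one common way this kind of OOM shows up.
    val joined: RDD[(Long, (String, Int))] = rows.join(tuples, 200)
    joined.collect().foreach(println)

    sc.stop()
  }
}
```

Raising the partition count passed to `join` (or `spark.default.parallelism`) shrinks the per-task working set, which is often the cheapest thing to try before adding memory.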