Also, for the record, turning on Kryo serialization did not help.
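(For anyone searching the archives: enabling Kryo in Spark 2.0 looks roughly like the sketch below. This is a minimal illustration with hypothetical names; the thread doesn't show the actual settings that were tried.)

    // Minimal sketch: switching a Spark 2.0 job to Kryo serialization.
    // "KryoExample" and local[*] are illustrative, not from the thread.
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("KryoExample")
      .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .getOrCreate()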
On Tue, Aug 23, 2016 at 12:58 PM, Arun Luthra wrote:
Splitting up the Maps to separate objects did not help.
However, I was able to work around the problem by reimplementing it with
RDD joins.
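(A sketch of what such a rewrite can look like, with made-up stand-in data; the thread doesn't include the actual code. Instead of capturing the large Map in a closure, the Map itself becomes an RDD and the lookup becomes a join:)

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(
      new SparkConf().setMaster("local[*]").setAppName("JoinExample"))

    // Hypothetical stand-ins for the real data.
    val lookup: Map[String, Int] = Map("a" -> 1, "b" -> 2)
    val records = sc.parallelize(Seq(("a", "rec1"), ("b", "rec2")))

    // Before: lookup is serialized into every task closure.
    // val tagged = records.map { case (k, v) => (k, (v, lookup(k))) }

    // After: distribute the Map as an RDD and join on the key.
    val lookupRdd = sc.parallelize(lookup.toSeq)
    val tagged = records.join(lookupRdd) // RDD[(String, (String, Int))]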
On Aug 18, 2016 5:16 PM, "Arun Luthra" wrote:
This might be caused by a few large Map objects that Spark is trying to
serialize. These are not broadcast variables or anything; they're just
regular objects.
Would it help if I further indexed these maps into a two-level Map, i.e.
Map[String, Map[String, Int]]? Or would this still count against the same
serialized size?
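(For readers hitting the same thing: a regular object referenced inside an RDD closure is serialized with the task closure as a whole object graph, so reshaping one flat Map into a nested Map[String, Map[String, Int]] shouldn't change the total bytes serialized. The usual alternative is a broadcast variable, shipped once per executor. A minimal sketch with made-up data:)

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(
      new SparkConf().setMaster("local[*]").setAppName("BroadcastExample"))

    val lookup: Map[String, Int] = Map("a" -> 1, "b" -> 2) // stand-in for the large Map

    // Captured directly: the entire Map is serialized into every task closure.
    val viaClosure = sc.parallelize(Seq("a", "b")).map(k => lookup.getOrElse(k, 0))

    // Broadcast: the Map is shipped to each executor once and referenced by handle.
    val bLookup = sc.broadcast(lookup)
    val viaBroadcast = sc.parallelize(Seq("a", "b")).map(k => bLookup.value.getOrElse(k, 0))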
I got this OOM error in Spark local mode. The error seems to have occurred at
the start of a stage (all of the stages on the UI showed as complete; there
were more stages to run, but they had not appeared on the UI yet).
There appeared to be ~100G of free memory at the time of the error.
Spark 2.0.0
200G