Never mind, all of this happened because somewhere in my code I wrote `def`
instead of `val`, which caused `collectAsMap` to be executed on every call.
Not sure why Spark at some point decided to create a new context, though...
Anyway, sorry for the noise.
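The gotcha can be illustrated in plain Scala, independent of Spark: `def` re-evaluates its right-hand side on every reference, while `val` evaluates it exactly once at definition time. A minimal sketch (names are hypothetical, not from the original job):

```scala
object DefVsVal {
  def main(args: Array[String]): Unit = {
    var evaluations = 0

    // `def`: the body runs again on EVERY reference --
    // in the Spark job this meant collectAsMap() ran on every call
    def lookupDef: Map[Int, String] = { evaluations += 1; Map(1 -> "a") }

    // `val`: the body runs ONCE, when the val is defined
    val lookupVal: Map[Int, String] = { evaluations += 1; Map(1 -> "a") }

    lookupDef
    lookupDef
    lookupVal
    lookupVal
    // evaluations is now 3: twice for the two lookupDef references,
    // once for the val definition, and zero for the val references
    println(evaluations)
  }
}
```

With `val`, the collected Map is computed once and reused; with `def`, every reference silently repeats the whole computation.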
sstraub wrote
Hi,
I'm working on a Spark job that frequently iterates over huge RDDs and
matches their elements against a Map that easily fits into memory, so what
I do is broadcast that Map and reference it from my RDD.
Works like a charm, until at some point it doesn't, and I can't figure out
why...
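For reference, the pattern described above looks roughly like this. This is a minimal sketch assuming a local SparkSession and the RDD API; the data and names are illustrative, not from the original job:

```scala
import org.apache.spark.sql.SparkSession

object BroadcastLookup {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("broadcast-lookup")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Small lookup table that easily fits into memory
    val lookup: Map[Int, String] = Map(0 -> "zero", 1 -> "one", 2 -> "two")

    // Broadcast it ONCE; `val` matters here -- a `def` would
    // re-broadcast (and re-collect, if the Map came from an RDD
    // via collectAsMap) on every reference
    val bcast = sc.broadcast(lookup)

    // Reference the broadcast value from the closure; each executor
    // receives the Map once instead of once per task
    val rdd = sc.parallelize(1 to 1000000)
    val matched = rdd.flatMap(x => bcast.value.get(x % 3))

    println(matched.count())
    spark.stop()
  }
}
```

If the Map itself is built with `collectAsMap` on a driver-side RDD, binding that result to a `val` before broadcasting ensures the collect happens exactly once.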