Hi,

My Spark application throws a StackOverflowError after a while.
The DAGScheduler method submitMissingTasks tries to serialize a tuple
(MapPartitionsRDD, the EsSpark.saveToEs function), which is handled by a
recursive algorithm. The recursion goes too deep and results in a
StackOverflowError.
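
For context, here is a minimal sketch of the kind of job that seems to
trigger it (the index name, numbers, and the identity maps are illustrative,
not my actual code): every map wraps the previous RDD in another
MapPartitionsRDD, so the lineage that submitMissingTasks has to serialize
keeps getting deeper.

  import org.apache.spark.{SparkConf, SparkContext}
  import org.elasticsearch.spark.rdd.EsSpark

  val sc = new SparkContext(new SparkConf().setAppName("es-lineage-repro"))
  // ES connection settings (es.nodes etc.) omitted here.

  var rdd = sc.parallelize(1 to 1000).map(i => Map("value" -> i))
  // Each iteration adds one more MapPartitionsRDD to the lineage chain.
  for (_ <- 1 to 10000) {
    rdd = rdd.map(identity)
  }
  // DAGScheduler.submitMissingTasks serializes the whole chain here,
  // and the recursive serialization blows the stack.
  EsSpark.saveToEs(rdd, "myindex/mytype")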

Should I just try to increase the heap size? Or will the same error just
happen again later?
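
From reading about this, I wonder whether it is really the thread stack
size (-Xss) rather than the heap that matters here, since the recursion
overflows the stack. If so, I assume I would pass something like this to
spark-submit (16m is just a guess):

  spark-submit \
    --driver-java-options "-Xss16m" \
    --conf "spark.executor.extraJavaOptions=-Xss16m" \
    ... # plus the usual application jar and arguments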

How can I fix this?

With kind regards,
michel


