Hi,
I have implemented a custom Partitioner (org.apache.spark.Partitioner) that
contains a medium-sized object (a few megabytes). Unfortunately Spark (2.1.0)
fails with a StackOverflowError, and I suspect the cause is the size of the
partitioner, which needs to be serialized. My question: what is the maximum
size of a Partitioner that Spark accepts?
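
For reference, here is a minimal sketch of the setup (class and field names
are simplified placeholders; the real multi-megabyte object is a
domain-specific lookup structure):

    import org.apache.spark.Partitioner

    // Hypothetical example: `lookup` stands in for the real
    // multi-megabyte object held by the partitioner.
    class LookupPartitioner(override val numPartitions: Int,
                            lookup: Array[Byte]) extends Partitioner {

      // The whole `lookup` array gets serialized together with the
      // partitioner when Spark ships it to the executors.
      override def getPartition(key: Any): Int = {
        val mod = key.hashCode % numPartitions
        if (mod < 0) mod + numPartitions else mod
      }

      override def equals(other: Any): Boolean = other match {
        case p: LookupPartitioner => p.numPartitions == numPartitions
        case _                    => false
      }

      override def hashCode: Int = numPartitions
    }

    // Used roughly like this, with a ~4 MB payload:
    // rdd.partitionBy(new LookupPartitioner(8, new Array[Byte](4 << 20)))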
Thanks!


