Hello.
I'm sorry but did you find the answer?
I have a similar error and I cannot solve it... No one has answered me...
Spark driver dies and I get the error "Answer from Java side is empty".
I thought it happened because I made a mistake in this conf file.
I use Sparkling Water 1.6.3, Spark
Try reducing the number of workers to 2, and increasing their memory up to 6GB.
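As a rough sketch, that suggestion translates into standard `spark-submit` options like the following (the script name and driver memory value are placeholders, not from the thread; adjust for your cluster):

```shell
# Launch with fewer, larger executors: 2 workers at 6 GB each,
# per the suggestion above. --driver-memory is an extra assumption.
spark-submit \
  --num-executors 2 \
  --executor-memory 6g \
  --driver-memory 4g \
  your_sparkling_water_app.py
```

The same values can also be set in `spark-defaults.conf` via `spark.executor.instances` and `spark.executor.memory`.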
However, I've seen mention of a bug in the PySpark API when calling head()
on a DataFrame in Spark 1.4 and 1.5.0; it causes a big performance hit.
https://issues.apache.org/jira/browse/SPARK-10731
It's fixed in