Hello all,
I am running Hadoop 2.6.4 with Spark 2.0 and I have been trying to get dynamic
allocation to work, without success. I was able to get it to work with Spark
1.6.1, however.
When I issue the command spark-shell --master yarn --deploy-mode client,
this is the error I see:
16/08/24 00:05:40
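For context, dynamic allocation on YARN requires the external shuffle service in addition to the allocation flags. A minimal spark-defaults.conf sketch (property names are the standard Spark ones; the executor counts are illustrative values, not taken from this thread):

```
spark.dynamicAllocation.enabled          true
spark.shuffle.service.enabled            true
spark.dynamicAllocation.minExecutors     1
spark.dynamicAllocation.maxExecutors     10
```

On the YARN side, each NodeManager must also run Spark's shuffle service as an auxiliary service (yarn.nodemanager.aux-services including spark_shuffle, backed by org.apache.spark.network.yarn.YarnShuffleService in yarn-site.xml); a missing or mismatched shuffle-service jar is a common cause of dynamic allocation failing after a Spark upgrade.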
Could you give more environment information?
On Aug 9, 2016, at 11:35, Shane Lee <shane_y_...@yahoo.com.INVALID> wrote:
Hi All,
I am trying out SparkR 2.0 and have run into an issue with repartition.
Here is the R code (essentially a port of the pi-calculating scala example in
the spark package) that can reproduce the behavior:
schema <- structType(structField("input", "integer"), structField("output",
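The pi-calculating example being ported estimates π by Monte Carlo sampling: draw random points in the unit square and count how many fall inside the quarter circle. A minimal standalone Scala sketch of that core computation (no Spark; the `PiEstimate` object name and the fixed seed are illustrative choices, not from the thread):

```scala
import scala.util.Random

object PiEstimate {
  // Estimate pi by sampling `samples` random points in [-1, 1]^2
  // and measuring the fraction that land inside the unit circle.
  def estimatePi(samples: Int, seed: Long = 42L): Double = {
    val rng = new Random(seed)
    val inside = (1 to samples).count { _ =>
      val x = rng.nextDouble() * 2 - 1
      val y = rng.nextDouble() * 2 - 1
      x * x + y * y <= 1.0
    }
    4.0 * inside / samples
  }

  def main(args: Array[String]): Unit = {
    println(f"Pi is roughly ${estimatePi(1000000)}%.4f")
  }
}
```

In the Spark versions of this example, the sampling loop is distributed by mapping over a repartitioned range of indices and summing the per-partition counts, which is where the `repartition` call in the SparkR port comes in.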