dynamic allocation in Spark 2.0

2016-08-24 Thread Shane Lee
Hello all, I am running Hadoop 2.6.4 with Spark 2.0 and I have been trying to get dynamic allocation to work, without success. I was able to get it to work with Spark 1.6.1, however. When I issue the command spark-shell --master yarn --deploy-mode client, this is the error I see: 16/08/24 00:05:40
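For context on what the poster is attempting: dynamic allocation on YARN is normally enabled with the standard Spark properties below. This is a sketch of a typical setup, not the configuration from the thread; the property names are Spark's documented ones, and the external shuffle service must also be registered as a YARN auxiliary service (spark_shuffle, backed by org.apache.spark.network.yarn.YarnShuffleService) in yarn-site.xml on each NodeManager.

```
# spark-defaults.conf (sketch; values illustrative)
spark.dynamicAllocation.enabled   true
spark.shuffle.service.enabled     true
# On YARN, yarn-site.xml must also list spark_shuffle under
# yarn.nodemanager.aux-services, with
# yarn.nodemanager.aux-services.spark_shuffle.class set to
# org.apache.spark.network.yarn.YarnShuffleService, and the
# spark-<version>-yarn-shuffle.jar on the NodeManager classpath.
```

A missing or version-mismatched shuffle-service jar on the NodeManagers is a common reason dynamic allocation works on one Spark version but not another.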

Re: SparkR error when repartition is called

2016-08-09 Thread Shane Lee
Could you give more environment information? On Aug 9, 2016, at 11:35, Shane Lee <shane_y_...@yahoo.com.INVALID> wrote: Hi All, I am trying out SparkR 2.0 and have run into an issue with repartition. Here is the R code (essentially a port of the pi-calculating Scala example in the s

SparkR error when repartition is called

2016-08-08 Thread Shane Lee
Hi All, I am trying out SparkR 2.0 and have run into an issue with repartition. Here is the R code (essentially a port of the pi-calculating Scala example in the Spark package) that can reproduce the behavior: schema <- structType(structField("input", "integer"), structField("output",
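The pi-calculating example the poster is porting is a Monte Carlo estimate: sample random points in the unit square and count the fraction that lands inside the quarter circle, which approaches pi/4. A minimal standalone sketch of that computation (plain Python, no Spark; the function name `estimate_pi` is illustrative, not from the thread):

```python
import random

def estimate_pi(num_samples, seed=42):
    """Monte Carlo estimate of pi.

    Draws points uniformly in the unit square; the fraction falling
    inside the quarter circle (x^2 + y^2 <= 1) approaches pi/4.
    """
    rng = random.Random(seed)
    inside = 0
    for _ in range(num_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / num_samples

print(estimate_pi(100_000))
```

In the Spark versions of this example, the sampling loop is distributed by partitioning the sample indices across executors, which is why the SparkR port exercises repartition.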