Sent: Tuesday, August 9, 2016 12:19 AM
Subject: Re: SparkR error when repartition is called
To: Sun Rui <sunrise_...@163.com<mailto:sunrise_...@163.com>>
Cc: User <user@spark.apache.org<mailto:user@spark.apache.org>>
Sun,
I am using Spark in YARN client mode in a 2-node cluster with hadoop-2.7.2. My R version is 3.3.1.
I have the following in my spark-defaults.conf: spark.executor.extraJavaOptions=-XX:+PrintGCDetails
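For reference, a spark-defaults.conf along those lines might look like the sketch below. The property names are standard Spark configuration keys; the specific values are illustrative, not taken from the original cluster:

```
# spark-defaults.conf (illustrative values, not the original cluster's settings)
spark.master                     yarn
spark.submit.deployMode          client
spark.executor.extraJavaOptions  -XX:+PrintGCDetails
```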
I can’t reproduce your issue with len=1 in local mode.
Could you give more environment information?
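For context, a minimal local-mode SparkR session that exercises repartition looks roughly like this. This is a sketch against the SparkR 2.0 API, not the original reproduction code; the master setting, data, and partition count are illustrative:

```r
library(SparkR)

# Start a local SparkR session (illustrative master and app name)
sparkR.session(master = "local[2]", appName = "repartition-check")

# Build a small DataFrame and repartition it
df <- createDataFrame(data.frame(input = 1:100))
df2 <- repartition(df, numPartitions = 4)

# Force evaluation so the shuffle actually runs
head(collect(df2))

sparkR.session.stop()
```

Running the equivalent of this in local mode is what the reply above refers to when it says the issue does not reproduce there.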
> On Aug 9, 2016, at 11:35, Shane Lee wrote:
Hi All,
I am trying out SparkR 2.0 and have run into an issue with repartition.
Here is the R code (essentially a port of the pi-calculating Scala example in the Spark package) that can reproduce the behavior:
schema <- structType(structField("input", "integer"), structField("output",