From: Rui [mailto:sunrise_...@163.com]
Sent: 17 May 2016 11:32
To: Mike Lewis
Cc: user@spark.apache.org
Subject: Re: SparkR query

Lewis,

1. Could you check the value of the “SPARK_HOME” environment variable on all of your worker nodes?
2. How did you start your SparkR shell?
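For the first question, a minimal sketch of such a check (the function name is hypothetical; run it locally on each worker, or via ssh, and compare the output across nodes):

```shell
#!/bin/sh
# Print the SPARK_HOME this node's environment provides, so the values
# can be compared across driver and workers.
check_spark_home() {
    if [ -n "${SPARK_HOME:-}" ]; then
        echo "SPARK_HOME=${SPARK_HOME}"
    else
        echo "SPARK_HOME is not set"
    fi
}

check_spark_home
```

If the printed paths differ between the driver and any worker (or are unset on a worker), that mismatch is a likely cause of SparkR jobs failing on those nodes.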
> On May 17, 2016, at 18:07, Mike Lewis wrote:
Hi,

I have a SparkR driver process that connects to a master running on Linux. I’ve tried to do a simple test, e.g.
sc <- sparkR.init(master="spark://my-linux-host.dev.local:7077",
sparkEnvir=list(spark.cores.max="4"))
x <- SparkR:::parallelize(sc,1:100,2)
y <- count(x)
But I can see that the