Re: SparkR query

2016-05-17 Thread Sun Rui
To: Mike Lewis
Cc: user@spark.apache.org
Subject: Re: SparkR query

> Lewis,
> 1. Could you check the values of “SPARK_HOME” environment on all of your worker nodes?
> 2. How did you start your SparkR shell?
>
> On May 17, 2016, at 18:07, Mike Lewis <mle...@nephila

RE: SparkR query

2016-05-17 Thread Mike Lewis
From: Sun Rui [mailto:sunrise_...@163.com]
Sent: 17 May 2016 11:32
To: Mike Lewis
Cc: user@spark.apache.org
Subject: Re: SparkR query

Lewis,
1. Could you check the values of “SPARK_HOME” environment on all of your worker nodes?
2. How did you start your SparkR shell?

On May 17, 2016, at 18:07, Mike Lewis

Re: SparkR query

2016-05-17 Thread Sun Rui
Lewis,
1. Could you check the values of “SPARK_HOME” environment on all of your worker nodes?
2. How did you start your SparkR shell?

> On May 17, 2016, at 18:07, Mike Lewis wrote:
>
> Hi,
>
> I have a SparkR driver process that connects to a master running on
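[Editor's note] Sun Rui's first check can be scripted. The sketch below is plain POSIX shell with the check wrapped in a hypothetical helper, `check_spark_home` (not part of Spark); run it on the driver and on each worker node, and compare the printed paths:

```shell
# check_spark_home: report whether SPARK_HOME is set and points at an
# existing directory. Run on the driver and on every worker node;
# all machines should print the same path.
check_spark_home() {
    if [ -z "${SPARK_HOME:-}" ]; then
        echo "SPARK_HOME is not set"
        return 1
    elif [ ! -d "$SPARK_HOME" ]; then
        echo "SPARK_HOME=$SPARK_HOME (directory missing)"
        return 1
    fi
    echo "SPARK_HOME=$SPARK_HOME"
}

check_spark_home || true  # non-zero status just flags a misconfiguration
```

A mismatch between the driver's SPARK_HOME and a worker's is a common cause of the kind of failure discussed in this thread.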

SparkR query

2016-05-17 Thread Mike Lewis
Hi,

I have a SparkR driver process that connects to a master running on Linux. I’ve tried to do a simple test, e.g.:

sc <- sparkR.init(master="spark://my-linux-host.dev.local:7077", sparkEnvir=list(spark.cores.max="4"))
x <- SparkR:::parallelize(sc, 1:100, 2)
y <- count(x)

But I can see that the
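[Editor's note] Sun Rui's second question matters because in Spark 1.x a SparkR shell is normally started through the bin/sparkR launcher, which forwards --master and --conf to spark-submit. The sketch below only composes that launch command as a string (the master URL and core count are the poster's values; the relative bin/sparkR path assumes a standard Spark layout):

```shell
# Compose the launch command for a SparkR shell against the standalone
# master from the message above. bin/sparkR forwards these options to
# spark-submit, so --master and --conf are accepted.
master_url="spark://my-linux-host.dev.local:7077"
launch_cmd="bin/sparkR --master $master_url --conf spark.cores.max=4"
echo "$launch_cmd"
```

Starting the shell this way sets the master and resource limits at launch, rather than relying solely on the arguments passed to sparkR.init().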