Re: SparkR - can't create spark context - JVM not ready

2015-08-20 Thread Deborah Siegel
Thanks Shivaram. You got me wondering about the path, so I wrote it out in
full and it worked. R does not, of course, expand a ~ in a string it merely
passes along.
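For anyone hitting the same thing, here is a minimal sketch of the workaround; the install directory is just the one from this thread, substitute your own:

```r
# Expand the tilde in R before setting SPARK_HOME. The value is later
# handed to sh verbatim by sparkR.init, and a literal "~" inside that
# string is not expanded by the shell at that point.
spark_home <- path.expand("~/software/spark-1.4.1-bin-hadoop2.4")
Sys.setenv(SPARK_HOME = spark_home)
```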

On Thu, Aug 20, 2015 at 4:35 PM, Shivaram Venkataraman 
shiva...@eecs.berkeley.edu wrote:

 Can you check if the file
 `~/software/spark-1.4.1-bin-hadoop2.4/bin/spark-submit` exists ? The
 error message seems to indicate it is trying to pick up Spark from
 that location and can't seem to find Spark installed there.
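That check can be run straight from the R console; this is just a sketch using the path from the error message:

```r
# file.exists() does tilde expansion itself, so this tells you whether
# Spark is actually installed at the path sparkR.init is trying to use.
spark_submit <- "~/software/spark-1.4.1-bin-hadoop2.4/bin/spark-submit"
file.exists(path.expand(spark_submit))
```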

 Thanks
 Shivaram

 On Thu, Aug 20, 2015 at 3:30 PM, Deborah Siegel
 deborah.sie...@gmail.com wrote:
  Hello,
 
  I have previously successfully run SparkR in RStudio, with:
 
  Sys.setenv(SPARK_HOME = "~/software/spark-1.4.1-bin-hadoop2.4")
  .libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
  library(SparkR)
  sc <- sparkR.init(master = "local[2]", appName = "SparkR-example")
 
 
  Then I tried putting some of it into an .Rprofile. Loading the paths and
  SparkR seemed to work, but I got an error when trying to create the sc. I
  then removed my .Rprofile, as well as .rstudio-desktop. However, I still
  cannot create the sc. Here is the error:
 
   sc <- sparkR.init(master = "local[2]", appName = "SparkR-example")
  Launching java with spark-submit command
  ~/software/spark-1.4.1-bin-hadoop2.4/bin/spark-submit   sparkr-shell
  /var/folders/p7/k1bpgmx93yd6pjq7dzf35gk8gn/T//RtmpOitA28/backend_port23377046db
  sh: ~/software/spark-1.4.1-bin-hadoop2.4/bin/spark-submit: No such file or directory
  Error in sparkR.init(master = "local[2]", appName = "SparkR-example") :
    JVM is not ready after 10 seconds
 
  I suspected there was an incomplete process or something. I checked for
  any running R or Java processes and there were none. Has anyone seen this
  type of error? I get the same error in both RStudio and the R shell (but
  not in the sparkR wrapper).
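  For reference, the .Rprofile attempt looked roughly like the sketch
  below; per the resolution upthread, the tilde has to be expanded (or the
  path written out absolutely) for spark-submit to be found:

```r
# Sketch of an .Rprofile doing the SparkR setup at startup. The install
# directory is a placeholder -- substitute your own location.
local({
  spark_home <- path.expand("~/software/spark-1.4.1-bin-hadoop2.4")
  Sys.setenv(SPARK_HOME = spark_home)
  .libPaths(c(file.path(spark_home, "R", "lib"), .libPaths()))
})
```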
 
  Thanks,
  Deb
 
 


