Re: SparkR - can't create spark context - JVM not ready

2015-08-20 Thread Deborah Siegel
Thanks Shivaram. You got me wondering about the path, so I put it in full and it worked. R does not, of course, expand a "~".
On Thu, Aug 20, 2015 at 4:35 PM, Shivaram Venkataraman <shiva...@eecs.berkeley.edu> wrote:
> Can you check if the file `~/software/spark-1.4.1-bin-hadoop2.4/bin/spark-su
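[Editor's note: the diagnosis above can be demonstrated in base R. `Sys.setenv` stores the string verbatim, and `path.expand` is the base-R function that resolves the tilde; the path is the one from this thread. A minimal sketch, assuming the expansion must happen before the value is handed to an external process such as spark-submit:]

```r
# Sys.setenv stores the value verbatim -- the "~" is NOT expanded here.
Sys.setenv(SPARK_HOME = "~/software/spark-1.4.1-bin-hadoop2.4")
Sys.getenv("SPARK_HOME")
# [1] "~/software/spark-1.4.1-bin-hadoop2.4"

# path.expand() resolves the "~" to the user's home directory.
# R's own file functions (e.g. file.exists) expand "~" internally, but a
# literal "~" passed to an external process launched from R is not expanded
# by a shell, which is why spark-submit could not be found.
path.expand(Sys.getenv("SPARK_HOME"))
```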

Re: SparkR - can't create spark context - JVM not ready

2015-08-20 Thread Shivaram Venkataraman
Can you check if the file `~/software/spark-1.4.1-bin-hadoop2.4/bin/spark-submit` exists? The error message indicates SparkR is trying to pick up Spark from that location and cannot find it installed there.
Thanks
Shivaram
On Thu, Aug 20, 2015 at 3:30 PM, Deborah Siegel wrote:
> H

SparkR - can't create spark context - JVM not ready

2015-08-20 Thread Deborah Siegel
Hello, I have previously successfully run SparkR in RStudio, with:
>Sys.setenv(SPARK_HOME="~/software/spark-1.4.1-bin-hadoop2.4")
>.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
>library(SparkR)
>sc <- sparkR.init(master="local[2]",appName="SparkR-example")
Then I tr
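[Editor's note: the resolution reached later in the thread was to give SPARK_HOME an absolute path instead of one containing "~". A minimal sketch of the corrected setup, assuming the SparkR 1.4.1 package layout from the message above; `path.expand` is used so the user's real home directory need not be hard-coded:]

```r
# Expand the "~" up front so SPARK_HOME holds an absolute path; the
# tilde in the original call was passed through literally and the
# spark-submit launcher could not be located.
Sys.setenv(SPARK_HOME = path.expand("~/software/spark-1.4.1-bin-hadoop2.4"))

# Put the SparkR library bundled with the Spark distribution on the path.
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))

library(SparkR)
sc <- sparkR.init(master = "local[2]", appName = "SparkR-example")
```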