Re: SparkR - can't create spark context - JVM not ready

2015-08-20 Thread Deborah Siegel
Thanks Shivaram. You got me wondering about the path, so I wrote it out in full and it worked. R does not, of course, expand a `~` here. On Thu, Aug 20, 2015 at 4:35 PM, Shivaram Venkataraman shiva...@eecs.berkeley.edu wrote: Can you check if the file `~/software/spark-1.4.1-bin-hadoop2.4/bin/spark-submit`
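A minimal sketch of the fix described above, assuming the same layout under the home directory: `Sys.setenv()` stores the literal string, so an unexpanded `~` gets passed through to the JVM when SparkR shells out to `spark-submit`. Expanding the path in R first (with base R's `path.expand()`) sidesteps the problem without hard-coding the full path.

```r
# Expand "~" in R before handing the path to the JVM.
# path.expand() is base R; the directory name below is just this thread's example.
spark_home <- path.expand("~/software/spark-1.4.1-bin-hadoop2.4")
Sys.setenv(SPARK_HOME = spark_home)

# Sanity check: does the launcher script actually resolve?
file.exists(file.path(spark_home, "bin", "spark-submit"))
```

If `file.exists()` returns `FALSE`, the path (not the JVM) is the problem, which matches the symptom reported here.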

SparkR - can't create spark context - JVM not ready

2015-08-20 Thread Deborah Siegel
Hello, I have previously successfully run SparkR in RStudio, with:

Sys.setenv(SPARK_HOME = "~/software/spark-1.4.1-bin-hadoop2.4")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR)
sc <- sparkR.init(master = "local[2]", appName = "SparkR-example")

Then I tried putting some