I have installed the SparkR package from the Spark distribution into my R
library. I can run the following command and it seems to work properly:
library(SparkR)

However, when I try to get a Spark context with the following code:

sc <- sparkR.init(master = "local")

it fails after some time with the following message:

Error in sparkR.init(master = "local") :
   JVM is not ready after 10 seconds
I have set JAVA_HOME, and I have a working RStudio installation where I can
load other packages such as ggplot2. I don't know why sparkR.init is failing,
or even where to start investigating the issue.
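For reference, this is roughly how my session is set up before the failing
call (the SPARK_HOME path below is a placeholder, not my exact installation
directory):

# Point R at the SparkR library shipped with the Spark distribution
# (placeholder path -- in my case, the directory I unpacked Spark into)
Sys.setenv(SPARK_HOME = "/path/to/spark")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))

library(SparkR)

# JAVA_HOME is set; this prints a non-empty path in my session
Sys.getenv("JAVA_HOME")

# This is the call that fails with "JVM is not ready after 10 seconds"
sc <- sparkR.init(master = "local")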


