Hello Rui Sun,

Thanks for your reply.
In the file "README.md", the section "Using SparkR from RStudio" says to set:
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))

Could you please tell me how to set this in a Windows environment? That is,
how do I set up .libPaths(), and where does it go on Windows?
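Is it something like the following, run at the top of my R script in RStudio?
This is just my guess, assuming Spark is unpacked at C:/spark (a hypothetical
path; adjust to the actual install location):

```r
# Hypothetical Windows setup; C:/spark is an assumed install path.
# Point SPARK_HOME at the Spark directory (forward slashes work on Windows in R),
# then prepend Spark's bundled R library to the library search path so that
# library(SparkR) can locate the package.
Sys.setenv(SPARK_HOME = "C:/spark")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))

library(SparkR)
sc <- sparkR.init(master = "local")
```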
Thanks for your help

Sincerely,
Ashish Dutt


On Mon, Jul 13, 2015 at 3:48 PM, Sun, Rui <rui....@intel.com> wrote:

> Hi, Kachau,
>
> If you are using SparkR with RStudio, have you followed the guidelines in
> the section "Using SparkR from RStudio" in
> https://github.com/apache/spark/tree/master/R ?
>
> ________________________________________
> From: kachau [umesh.ka...@gmail.com]
> Sent: Saturday, July 11, 2015 12:30 AM
> To: user@spark.apache.org
> Subject: SparkR Error in sparkR.init(master=“local”) in RStudio
>
> I have installed the SparkR package from Spark distribution into the R
> library. I can call the following command and it seems to work properly:
> library(SparkR)
>
> However, when I try to get the Spark context using the following code,
>
> sc <- sparkR.init(master="local")
> It fails after some time with the following message:
>
> Error in sparkR.init(master = "local") :
>    JVM is not ready after 10 seconds
> I have set JAVA_HOME, and I have a working RStudio where I can access other
> packages like ggplot2. I don't know why it is not working, and I don't even
> know where to investigate the issue.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/SparkR-Error-in-sparkR-init-master-local-in-RStudio-tp23768.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
