[ https://issues.apache.org/jira/browse/SPARK-22011?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-22011.
-------------------------------
    Resolution: Not A Problem

The error pretty much says it all -- set SPARK_HOME. See the SparkR docs.

> model <- spark.logit(training, Survived ~ ., regParam = 0.5) showing error
> --------------------------------------------------------------------------
>
>                 Key: SPARK-22011
>                 URL: https://issues.apache.org/jira/browse/SPARK-22011
>             Project: Spark
>          Issue Type: Bug
>          Components: Examples
>    Affects Versions: 2.2.0
>        Environment: Error shown in SparkR
>            Reporter: Atul Khairnar
>
> Sys.setenv(SPARK_HOME="C:/spark")
> .libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
> Sys.setenv(JAVA_HOME="C:/Program Files/Java/jdk1.8.0_144/")
> library(SparkR)
> sc <- sparkR.session(master = "local")
> sqlContext <- sparkRSQL.init(sc)
>
> Output in RStudio shows a warning:
> Warning message:
> 'sparkRSQL.init' is deprecated.
> Use 'sparkR.session' instead.
> See help("Deprecated")
>
> Can you help me understand what exactly this error/warning means? Next, running
> model <- spark.logit(training, Survived ~ ., regParam = 0.5)
> fails with:
> Error in handleErrors(returnStatus, conn) :
>   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0
> in stage 35.0 failed 1 times, most recent failure: Lost task 0.0 in stage
> 35.0 (TID 31, localhost, executor driver): org.apache.spark.SparkException:
> SPARK_HOME not set. Can't locate SparkR package.
> 	at org.apache.spark.api.r.RUtils$$anonfun$2.apply(RUtils.scala:88)
> 	at org.apache.spark.api.r.RUtils$$anonfun$2.apply(RUtils.scala:88)
> 	at scala.Option.getOrElse(Option.scala:121)
> 	at org.apache.spark.api.r.RUtils$.sparkRPackagePath(RUtils.scala:87)
> 	at org.apache.spark.api.r.RRunner$.createRProcess(RRunner.scala:339)
> 	at org.apache.spark.api.r.RRunner$.createRWorker(RRunner.scala:391)
> 	at org.apache.spark.api.r.RRunner.compute(RRunner.scala:69)
> 	at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:51)
> 	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
> 	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
> 	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38
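
For anyone hitting the same error: below is a minimal sketch of a setup that follows the resolution advice, assuming Spark 2.2.0 is unpacked at C:/spark and the JDK at the reporter's path (adjust both to your machine; the Titanic data frame is just the example used in the spark.logit docs). The key points are to set SPARK_HOME before loading SparkR, so both the driver and the spawned R worker processes can locate the SparkR package, and to use sparkR.session() instead of the deprecated sparkRSQL.init().

    # Set the environment *before* library(SparkR) so child processes inherit it.
    # Paths below are the reporter's; substitute your own install locations.
    Sys.setenv(SPARK_HOME = "C:/spark")
    Sys.setenv(JAVA_HOME = "C:/Program Files/Java/jdk1.8.0_144/")
    .libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))

    library(SparkR)

    # sparkR.session() replaces the deprecated sparkRSQL.init(); passing
    # sparkHome explicitly makes the dependency on SPARK_HOME visible.
    sparkR.session(master = "local", sparkHome = Sys.getenv("SPARK_HOME"))

    # 'training' must be a SparkDataFrame; the built-in Titanic table is the
    # dataset used in the SparkR spark.logit documentation example.
    t <- as.data.frame(Titanic)
    training <- createDataFrame(t)
    model <- spark.logit(training, Survived ~ ., regParam = 0.5)
    summary(model)

If the "SPARK_HOME not set" error still appears when launching from RStudio, setting SPARK_HOME as a system-level environment variable before starting RStudio (so the launched JVM inherits it) is a common workaround.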