Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/6743#discussion_r33537183

    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
    @@ -339,6 +340,24 @@ object SparkSubmit {
           }
         }

    +    // In yarn mode for an R app, add the SparkR package archive to archives
    +    // that can be distributed with the job
    +    if (args.isR && clusterManager == YARN) {
    +      val sparkHome = sys.env.get("SPARK_HOME")
    +      if (sparkHome.isEmpty) {
    +        printErrorAndExit("SPARK_HOME does not exist for R application in yarn mode.")
    +      }
    +      val rPackagePath = Seq(sparkHome.get, "R", "lib").mkString(File.separator)
    --- End diff --

    Is there a way to make this also go through `RUtils.sparkRPackagePath`? It seems weird to have two separate code paths here.
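    For illustration, a minimal sketch of what routing this block through the
    shared helper might look like. It assumes `RUtils.sparkRPackagePath` takes
    an `isDriver` flag and resolves the lib directory under SPARK_HOME itself;
    the exact signature at this point in the PR is an assumption:

        // Hypothetical refactor: resolve the SparkR lib directory through the
        // shared RUtils helper so there is one source of truth for this path.
        // The sparkRPackagePath signature shown here is an assumption.
        if (args.isR && clusterManager == YARN) {
          if (sys.env.get("SPARK_HOME").isEmpty) {
            printErrorAndExit("SPARK_HOME does not exist for R application in yarn mode.")
          }
          // isDriver = false: this archive is distributed to the executors.
          val rPackagePath = RUtils.sparkRPackagePath(isDriver = false)
        }

    That would keep SparkSubmit and the R runtime agreeing on where the SparkR
    package lives, instead of each rebuilding the path independently.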