Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13296#discussion_r64742562

    --- Diff: docs/running-on-yarn.md ---
    @@ -60,6 +60,8 @@ Running Spark on YARN requires a binary distribution of Spark which is built wit
     Binary distributions can be downloaded from the [downloads page](http://spark.apache.org/downloads.html) of the project website. To build Spark yourself, refer to [Building Spark](building-spark.html).
     
    +To make Spark runtime jars accessible from YARN side, basically you could specify through `spark.yarn.archive` or `spark.yarn.jars`, for the details please refer to [Spark Properties](running-on-yarn.html#spark-properties). If neither `spark.yarn.archive` nor `spark.yarn.jars` is specified, Spark will fall back to zip all the jars under `$SPARK_HOME/jars` and upload to distributed cache.
    --- End diff --

    This needs a little bit of editing:

    "basically you could specify through" -> "you can specify"
    ", for the details" -> ". For details,"
    "to distributed cache" -> "to the distributed cache"
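For context, the two properties discussed in the diff control where YARN finds the Spark runtime jars. A minimal sketch of how they might appear in `conf/spark-defaults.conf` (the HDFS paths here are hypothetical examples, not taken from the patch):

```properties
# Option 1: point YARN at a single archive containing all Spark runtime jars.
# (hdfs:///shared/spark-libs.zip is a hypothetical path)
spark.yarn.archive  hdfs:///shared/spark-libs.zip

# Option 2 (alternative to the above): list the jars directly.
# spark.yarn.jars  hdfs:///shared/spark-jars/*.jar
```

If neither property is set, Spark falls back to zipping everything under `$SPARK_HOME/jars` and uploading it to the distributed cache, as the added documentation line notes.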