Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13217#discussion_r64022461

    --- Diff: R/WINDOWS.md ---
    @@ -11,3 +11,19 @@ include Rtools and R in `PATH`.
     directory in Maven in `PATH`.
     4. Set `MAVEN_OPTS` as described in [Building Spark](http://spark.apache.org/docs/latest/building-spark.html).
     5. Open a command shell (`cmd`) in the Spark directory and run `mvn -DskipTests -Psparkr package`
    +
    +## Unit tests
    +
    +To run existing unit tests in SparkR on Windows, the following setps are required (the steps below suppose you are in Spark root directory)
    +
    +1. Set `HADOOP_HOME`.
    +2. Download `winutils.exe` and locate this in `$HADOOP_HOME/bin`.
    +
    +  It seems not requiring installing Hadoop but only this `winutils.exe`. It seems not included in Hadoop official binary releases so it should be built from source but it seems it is able to be downloaded from community (e.g. [steveloughran/winutils](https://github.com/steveloughran/winutils)).
    +
    +3. Run unit-tests for SparkR by running below (you need to install the [testthat](http://cran.r-project.org/web/packages/testthat/index.html) package first):
    --- End diff --

    Use "unit tests" and "by running the command below". Again, the parenthetical can be a sentence. This step is already documented in the R docs, though.
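    For reference, a minimal sketch of how these steps might look in a Windows command shell (`cmd`), run from the Spark root directory. The `C:\hadoop` path is an assumption, and the exact test invocation may differ from the command the PR ends up documenting:

    ```cmd
    REM Sketch only: the HADOOP_HOME path and the test invocation are assumptions.
    REM 1. Point HADOOP_HOME at a directory whose bin\ contains winutils.exe.
    set HADOOP_HOME=C:\hadoop
    set PATH=%HADOOP_HOME%\bin;%PATH%

    REM 2. Install the testthat package once.
    R -e "install.packages('testthat', repos='https://cloud.r-project.org')"

    REM 3. Run the SparkR unit tests from the Spark root directory.
    .\bin\spark-submit.cmd R\pkg\tests\run-all.R
    ```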