GitHub user shivaram opened a pull request: https://github.com/apache/spark/pull/14173
[SPARKR][SPARK-16507] Add a CRAN checker, fix Rd aliases

## What changes were proposed in this pull request?

Add a check-cran.sh script that runs `R CMD check` as CRAN would. This also fixes a number of issues pointed out by the check, including:

- Updating `DESCRIPTION` to be appropriate
- Adding a `.Rbuildignore` to ignore lintr, src-native, and html, which are non-standard files/dirs
- Adding aliases to all S4 methods in DataFrame, Column, GroupedData, etc. This is required as stated in https://cran.r-project.org/doc/manuals/r-release/R-exts.html#Documenting-S4-classes-and-methods
- Other minor fixes

## How was this patch tested?

SparkR unit tests, and by running the above-mentioned script.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/shivaram/spark-1 sparkr-cran-changes

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/14173.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #14173

----

commit be7a74fe4f25c0f6123d52b6698d0f354c3a5978
Author: Shivaram Venkataraman <shiva...@cs.berkeley.edu>
Date:   2016-07-12T01:41:35Z

    Add a CRAN checker. Also update Rd documentation to address a number of CRAN check warnings.

commit 62be1cbe9464acb5dbcb62f2c54fe906488ded6a
Author: Shivaram Venkataraman <shiva...@cs.berkeley.edu>
Date:   2016-07-13T00:11:13Z

    Use x instead of col in window functions
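For context on the alias fix: the "Writing R Extensions" section linked in the PR description requires that each documented S4 method have an `\alias` entry of the form `\alias{generic,signature-method}`, otherwise `R CMD check --as-cran` reports the method as undocumented. The email does not show the actual Rd changes, but a hedged illustration of the required shape, using a hypothetical `select` method on the DataFrame class, might look like:

```
% Hypothetical Rd fragment (not from the PR); SparkR's Rd files are
% generated from roxygen comments, but the rendered aliases take this form.
\name{select}
\alias{select}
\alias{select,DataFrame,character-method}
\title{Select columns from a DataFrame}
```

With aliases like these present for every exported S4 method, the "undocumented S4 methods" warning from the CRAN-style check goes away.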