It looks like they would not allow caching the Spark Distribution. I’m not sure what can be done about this.
If I recall, the package should remove this during test. Or maybe spark.install() is optional (hence getting user confirmation?)

---------- Forwarded message ---------
Date: Sun, Jun 13, 2021 at 10:19 PM
Subject: CRAN package SparkR
To: Felix Cheung <felixche...@apache.org>
CC: <c...@r-project.org>

Dear maintainer,

Checking this apparently creates the default directory as per

#' @param localDir a local directory where Spark is installed. The directory contains
#'                 version-specific folders of Spark packages. Default is path to
#'                 the cache directory:
#' \itemize{
#'   \item Mac OS X: \file{~/Library/Caches/spark}
#'   \item Unix: \env{$XDG_CACHE_HOME} if defined, otherwise \file{~/.cache/spark}
#'   \item Windows: \file{\%LOCALAPPDATA\%\\Apache\\Spark\\Cache}.
#' }

However, the CRAN Policy says

- Packages should not write in the user's home filespace (including clipboards), nor anywhere else on the file system apart from the R session's temporary directory (or during installation in the location pointed to by TMPDIR: and such usage should be cleaned up). Installing into the system's R installation (e.g., scripts to its bin directory) is not allowed. Limited exceptions may be allowed in interactive sessions if the package obtains confirmation from the user. For R version 4.0 or later (hence a version dependency is required or only conditional use is possible), packages may store user-specific data, configuration and cache files in their respective user directories obtained from tools::R_user_dir(), provided that by default sizes are kept as small as possible and the contents are actively managed (including removing outdated material).

Can you please fix as necessary? Please fix before 2021-06-28 to safely retain your package on CRAN.

Best
-k
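The quoted policy leaves two compliant routes: use tools::R_user_dir() on R >= 4.0, or get explicit confirmation in an interactive session before touching the home filespace, falling back to tempdir() otherwise. A minimal sketch of that decision (hypothetical helper, not SparkR's actual code; the function name spark_cache_dir is an assumption):

```r
# Hypothetical sketch: pick a Spark cache directory that satisfies the
# CRAN policy quoted above. Not SparkR's actual implementation.
spark_cache_dir <- function() {
  # On R >= 4.0, tools::R_user_dir() is the CRAN-sanctioned cache location
  # (the package must still keep it small and actively managed).
  if (getRversion() >= "4.0.0") {
    return(tools::R_user_dir("SparkR", which = "cache"))
  }
  # Older R: writing outside tempdir() is only allowed in interactive
  # sessions after the user confirms.
  if (interactive()) {
    ans <- readline("Install Spark under ~/.cache/spark? [y/N] ")
    if (tolower(ans) == "y") {
      return(path.expand("~/.cache/spark"))
    }
  }
  # Fall back to the session's temporary directory, which is always allowed.
  file.path(tempdir(), "spark")
}
```

On a non-interactive R 4.x session this resolves to the R_user_dir cache path; nothing is created until the caller actually writes there.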