Repository: spark
Updated Branches:
  refs/heads/master b264cbb16 -> 209e1b3c0


[SPARKR][MINOR] Fix Cache Folder Path in Windows

## What changes were proposed in this pull request?

This PR fixes the lookup of the local cache folder on Windows. The name passed to
`Sys.getenv` should be `LOCALAPPDATA` rather than `%LOCALAPPDATA%`: the `%...%`
form is cmd.exe expansion syntax, not part of the variable's name, so the old
lookup never matched and always returned unset.
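The difference can be illustrated with a minimal R sketch (the variable is simulated with `Sys.setenv` here, since `LOCALAPPDATA` only exists on Windows; the path value is made up):

```r
# Simulate the Windows environment variable (hypothetical path).
Sys.setenv(LOCALAPPDATA = "C:/Users/me/AppData/Local")

# Sys.getenv does no %...% expansion: it looks up the literal name.
old <- Sys.getenv("%LOCALAPPDATA%", unset = NA)  # no variable has that literal name -> NA
new <- Sys.getenv("LOCALAPPDATA", unset = NA)    # the actual variable name -> the path
```

With the old lookup, `winAppPath` was always `NA`, so SparkR fell back to the "not found" error even when the variable was set.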

## How was this patch tested?

Manually tested on Windows 7.

Author: Junyang Qian <junya...@databricks.com>

Closes #14743 from junyangq/SPARKR-FixWindowsInstall.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/209e1b3c
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/209e1b3c
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/209e1b3c

Branch: refs/heads/master
Commit: 209e1b3c0683a9106428e269e5041980b6cc327f
Parents: b264cbb
Author: Junyang Qian <junya...@databricks.com>
Authored: Mon Aug 22 10:03:48 2016 -0700
Committer: Shivaram Venkataraman <shiva...@cs.berkeley.edu>
Committed: Mon Aug 22 10:03:48 2016 -0700

----------------------------------------------------------------------
 R/pkg/R/install.R | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/209e1b3c/R/pkg/R/install.R
----------------------------------------------------------------------
diff --git a/R/pkg/R/install.R b/R/pkg/R/install.R
index 987bac7..ff81e86 100644
--- a/R/pkg/R/install.R
+++ b/R/pkg/R/install.R
@@ -212,7 +212,7 @@ hadoop_version_name <- function(hadoopVersion) {
 # adapt to Spark context
 spark_cache_path <- function() {
   if (.Platform$OS.type == "windows") {
-    winAppPath <- Sys.getenv("%LOCALAPPDATA%", unset = NA)
+    winAppPath <- Sys.getenv("LOCALAPPDATA", unset = NA)
     if (is.na(winAppPath)) {
       msg <- paste("%LOCALAPPDATA% not found.",
                    "Please define the environment variable",

