GitHub user asfgit closed the pull request at:
https://github.com/apache/spark/pull/15888
---
GitHub user yanboliang commented on a diff in the pull request:
https://github.com/apache/spark/pull/15888#discussion_r8285
--- Diff: R/pkg/inst/tests/testthat/test_sparkR.R ---
@@ -0,0 +1,36 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
---
GitHub user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/15888#discussion_r88808464
--- Diff: R/pkg/R/sparkR.R ---
@@ -373,8 +373,13 @@ sparkR.session <- function(
overrideEnvs(sparkConfigMap, paramMap)
}
+
---
GitHub user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/15888#discussion_r88808475
--- Diff: R/pkg/R/sparkR.R ---
@@ -550,24 +555,27 @@ processSparkPackages <- function(packages) {
#
# @param sparkHome directory to find Spark
---
GitHub user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/15888#discussion_r88763342
--- Diff: R/pkg/R/sparkR.R ---
@@ -373,8 +373,13 @@ sparkR.session <- function(
overrideEnvs(sparkConfigMap, paramMap)
}
+
---
GitHub user yanboliang commented on a diff in the pull request:
https://github.com/apache/spark/pull/15888#discussion_r88214305
--- Diff: R/pkg/R/sparkR.R ---
@@ -550,24 +566,28 @@ processSparkPackages <- function(packages) {
#
# @param sparkHome directory to find Spark
---
GitHub user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/15888#discussion_r88127026
--- Diff: R/pkg/R/sparkR.R ---
@@ -558,16 +558,18 @@ sparkCheckInstall <- function(sparkHome, master) {
message(msg)
NULL
---
GitHub user yanboliang commented on a diff in the pull request:
https://github.com/apache/spark/pull/15888#discussion_r88032672
--- Diff: R/pkg/R/sparkR.R ---
@@ -558,16 +558,18 @@ sparkCheckInstall <- function(sparkHome, master) {
message(msg)
NULL
}
---
GitHub user yanboliang commented on a diff in the pull request:
https://github.com/apache/spark/pull/15888#discussion_r88032786
--- Diff: R/pkg/R/sparkR.R ---
@@ -558,16 +558,18 @@ sparkCheckInstall <- function(sparkHome, master) {
message(msg)
NULL
}
---
GitHub user yanboliang opened a pull request:
https://github.com/apache/spark/pull/15888
[SPARK-18444][SPARKR] SparkR running in yarn-cluster mode should not
download Spark package.
## What changes were proposed in this pull request?
When running SparkR job in yarn-cluster
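The comments above all center on `sparkCheckInstall(sparkHome, master)` in `R/pkg/R/sparkR.R`, which decides whether SparkR needs to download a local Spark distribution. The idea behind the fix: when the job runs under a cluster deployment such as yarn-cluster, the cluster already provides Spark, so the download should be skipped. The sketch below is a hypothetical illustration of that decision, not the actual Spark source; the function name, return convention, and master-string check are all assumptions made for the example.

```r
# Hypothetical sketch of the install-skip decision (assumed logic, not the
# real sparkCheckInstall body). Returns NULL when no download is needed.
sparkCheckInstallSketch <- function(sparkHome, master) {
  # SPARK_HOME already points at an installation: nothing to download.
  sparkHomeSet <- !is.null(sparkHome) && nzchar(sparkHome)
  # Assumption: only an explicit local master ("local", "local[2]", ...)
  # needs a locally installed Spark; anything else (e.g. "yarn" in
  # cluster mode) runs where Spark is already provided.
  isLocalMaster <- grepl("^local(\\[.*\\])?$", master)
  if (sparkHomeSet || !isLocalMaster) {
    NULL
  } else {
    "install"  # placeholder for the download/install code path
  }
}
```

For example, `sparkCheckInstallSketch("", "yarn")` yields `NULL` (skip the download), while `sparkCheckInstallSketch("", "local[2]")` falls through to the install path.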