Repository: spark
Updated Branches:
  refs/heads/master 385d73848 -> a7ab6f9a8


[SPARK-19324][SPARKR] Spark JVM stdout output is getting dropped in SparkR

## What changes were proposed in this pull request?

This mostly affects running jobs from the driver in client mode, when results are
expected to come through stdout (which should be somewhat rare, but possible).

Before:
```
> a <- as.DataFrame(cars)
> b <- group_by(a, "dist")
> c <- count(b)
> sparkR.callJMethod(c$countjc, "explain", TRUE)
NULL
```

After:
```
> a <- as.DataFrame(cars)
> b <- group_by(a, "dist")
> c <- count(b)
> sparkR.callJMethod(c$countjc, "explain", TRUE)
count#11L
NULL
```

Now, `column.explain()` doesn't seem very useful on its own (we can get more
extensive output with `DataFrame.explain()`), but there are other, more complex
examples with calls to `println` on the Scala/JVM side whose output was getting
dropped the same way.
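
For context, the root cause is how base R's `system2` routes the child
process's stdout. A minimal standalone sketch of the three modes (plain base R,
not SparkR code; assumes a POSIX `echo` on the PATH):
```
# stdout = TRUE  captures the child's output as a character vector,
# stdout = FALSE discards it, and
# stdout = ""    streams it to the console (what this fix switches to).
out <- system2("echo", "captured", stdout = TRUE)   # out == "captured"
system2("echo", "discarded", stdout = FALSE)        # output is dropped
system2("echo", "to console", stdout = "")          # printed as it runs
```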

## How was this patch tested?

Manual testing, with the before/after examples above.

Author: Felix Cheung <felixcheun...@hotmail.com>

Closes #16670 from felixcheung/rjvmstdout.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/a7ab6f9a
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/a7ab6f9a
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/a7ab6f9a

Branch: refs/heads/master
Commit: a7ab6f9a8fdfb927f0bcefdc87a92cc82fac4223
Parents: 385d738
Author: Felix Cheung <felixcheun...@hotmail.com>
Authored: Fri Jan 27 12:41:35 2017 -0800
Committer: Shivaram Venkataraman <shiva...@cs.berkeley.edu>
Committed: Fri Jan 27 12:41:35 2017 -0800

----------------------------------------------------------------------
 R/pkg/R/utils.R                          | 11 ++++++++---
 R/pkg/inst/tests/testthat/test_Windows.R |  2 +-
 2 files changed, 9 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/a7ab6f9a/R/pkg/R/utils.R
----------------------------------------------------------------------
diff --git a/R/pkg/R/utils.R b/R/pkg/R/utils.R
index 74b3e50..1f7848f 100644
--- a/R/pkg/R/utils.R
+++ b/R/pkg/R/utils.R
@@ -756,12 +756,17 @@ varargsToJProperties <- function(...) {
   props
 }
 
-launchScript <- function(script, combinedArgs, capture = FALSE) {
+launchScript <- function(script, combinedArgs, wait = FALSE) {
   if (.Platform$OS.type == "windows") {
     scriptWithArgs <- paste(script, combinedArgs, sep = " ")
-    shell(scriptWithArgs, translate = TRUE, wait = capture, intern = capture) # nolint
+    # on Windows, intern = F seems to mean output to the console. (documentation on this is missing)
+    shell(scriptWithArgs, translate = TRUE, wait = wait, intern = wait) # nolint
   } else {
-    system2(script, combinedArgs, wait = capture, stdout = capture)
+    # http://stat.ethz.ch/R-manual/R-devel/library/base/html/system2.html
+    # stdout = F means discard output
+    # stdout = "" means to its console (default)
+    # Note that the console of this child process might not be the same as the running R process.
+    system2(script, combinedArgs, stdout = "", wait = wait)
   }
 }
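
Read together with the comments above, a hedged sketch of how the renamed
`wait` parameter behaves for callers (`launchScript` is an internal SparkR
helper; the `spark-submit` call is illustrative only):
```
# wait = FALSE (the default): fire and forget; the child's stdout now
# streams to the console via stdout = "" instead of being discarded.
launchScript("spark-submit", "--version", wait = FALSE)

# wait = TRUE: block until the child exits. On Windows this also sets
# intern = TRUE, so the output comes back as a character vector, which
# is what the Windows-only test below relies on.
out <- launchScript("ECHO", "a/b/c", wait = TRUE)
```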
 

http://git-wip-us.apache.org/repos/asf/spark/blob/a7ab6f9a/R/pkg/inst/tests/testthat/test_Windows.R
----------------------------------------------------------------------
diff --git a/R/pkg/inst/tests/testthat/test_Windows.R b/R/pkg/inst/tests/testthat/test_Windows.R
index 8813e18..e8d9834 100644
--- a/R/pkg/inst/tests/testthat/test_Windows.R
+++ b/R/pkg/inst/tests/testthat/test_Windows.R
@@ -20,7 +20,7 @@ test_that("sparkJars tag in SparkContext", {
   if (.Platform$OS.type != "windows") {
     skip("This test is only for Windows, skipped")
   }
-  testOutput <- launchScript("ECHO", "a/b/c", capture = TRUE)
+  testOutput <- launchScript("ECHO", "a/b/c", wait = TRUE)
   abcPath <- testOutput[1]
   expect_equal(abcPath, "a\\b\\c")
 })

