spark git commit: [SPARK-16339][CORE] ScriptTransform does not print stderr when outstream is lost

2016-07-06 Thread srowen
Repository: spark
Updated Branches:
  refs/heads/branch-2.0 6e8fa86eb -> 521fc7186


[SPARK-16339][CORE] ScriptTransform does not print stderr when outstream is lost

## What changes were proposed in this pull request?

Currently, if the outstream gets destroyed or closed due to some failure, the 
later `outstream.close()` call throws an IOException. Because of this, the 
`stderrBuffer` never gets logged and there is no way for users to see why the 
job failed.

The change is to first display the stderr buffer and then try closing the 
outstream.
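As a rough sketch of the pattern the fix relies on (the names `tryLogNonFatalError` and `closeAndLogStderr` here are illustrative stand-ins, not Spark's actual `Utils` implementation): the close call is wrapped so that a non-fatal failure is logged rather than propagated, which lets the stderr logging that follows still run.

```scala
import scala.util.control.NonFatal

object StderrLoggingSketch {
  // Execute a cleanup block; log non-fatal errors instead of rethrowing them
  // so that subsequent cleanup (logging the stderr buffer) still happens.
  def tryLogNonFatalError(block: => Unit): Unit = {
    try block
    catch {
      case NonFatal(e) => Console.err.println(s"Error during cleanup: ${e.getMessage}")
    }
  }

  // Simulates the writer thread's finally-block: even if close() throws,
  // we still reach the point where the stderr buffer is logged.
  def closeAndLogStderr(close: () => Unit, stderrBuffer: String): String = {
    tryLogNonFatalError(close())
    stderrBuffer // in the real code this is logError(stderrBuffer.toString)
  }

  def main(args: Array[String]): Unit = {
    val logged = closeAndLogStderr(
      () => throw new java.io.IOException("Stream closed"),
      "script failed: permission denied")
    println(logged)
  }
}
```

With the unwrapped `outputStream.close()`, the IOException would propagate out of the `finally` block before `logError(stderrBuffer.toString)` ever ran.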

## How was this patch tested?

The correct way to test this fix would be to grep the log to see whether the 
`stderrBuffer` gets logged, but I don't think test cases which do that are a 
good idea.


…

Author: Tejas Patil 

Closes #13834 from tejasapatil/script_transform.

(cherry picked from commit 5f342049cce9102fb62b4de2d8d8fa691c2e8ac4)
Signed-off-by: Sean Owen 


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/521fc718
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/521fc718
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/521fc718

Branch: refs/heads/branch-2.0
Commit: 521fc7186a2637321f7a7cfac713537de73ae66f
Parents: 6e8fa86
Author: Tejas Patil 
Authored: Wed Jul 6 09:18:04 2016 +0100
Committer: Sean Owen 
Committed: Wed Jul 6 09:18:13 2016 +0100

--
 .../spark/sql/hive/execution/ScriptTransformation.scala  | 8 
 1 file changed, 4 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/521fc718/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala
--
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala
index 9e25e1d..dfb1251 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala
@@ -312,15 +312,15 @@ private class ScriptTransformationWriterThread(
       }
       threwException = false
     } catch {
-      case NonFatal(e) =>
+      case t: Throwable =>
         // An error occurred while writing input, so kill the child process. According to the
         // Javadoc this call will not throw an exception:
-        _exception = e
+        _exception = t
         proc.destroy()
-        throw e
+        throw t
     } finally {
       try {
-        outputStream.close()
+        Utils.tryLogNonFatalError(outputStream.close())
         if (proc.waitFor() != 0) {
           logError(stderrBuffer.toString) // log the stderr circular buffer
         }
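A side note on the widened catch in this hunk: `NonFatal` deliberately does not match throwables such as `InterruptedException` or `VirtualMachineError`, so with `case NonFatal(e)` the handler would skip `proc.destroy()` for those. The sketch below (illustrative names, not Spark code) shows the difference in what each pattern matches.

```scala
import scala.util.control.NonFatal

object CatchWidthSketch {
  // Returns true if a handler written with the given pattern width would
  // actually run for throwable `t`. A plain `case t: Throwable` matches
  // everything; `case NonFatal(e)` excludes fatal errors and interrupts.
  def handles(matchNonFatalOnly: Boolean, t: Throwable): Boolean = t match {
    case NonFatal(_) => true
    case _ => !matchNonFatalOnly
  }

  def main(args: Array[String]): Unit = {
    val ie = new InterruptedException("writer thread interrupted")
    println(handles(matchNonFatalOnly = true, ie))  // NonFatal skips it
    println(handles(matchNonFatalOnly = false, ie)) // Throwable catches it
  }
}
```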


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



spark git commit: [SPARK-16339][CORE] ScriptTransform does not print stderr when outstream is lost

2016-07-06 Thread srowen
Repository: spark
Updated Branches:
  refs/heads/master ec79183ac -> 5f342049c


[SPARK-16339][CORE] ScriptTransform does not print stderr when outstream is lost

## What changes were proposed in this pull request?

Currently, if the outstream gets destroyed or closed due to some failure, the 
later `outstream.close()` call throws an IOException. Because of this, the 
`stderrBuffer` never gets logged and there is no way for users to see why the 
job failed.

The change is to first display the stderr buffer and then try closing the 
outstream.

## How was this patch tested?

The correct way to test this fix would be to grep the log to see whether the 
`stderrBuffer` gets logged, but I don't think test cases which do that are a 
good idea.


…

Author: Tejas Patil 

Closes #13834 from tejasapatil/script_transform.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/5f342049
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/5f342049
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/5f342049

Branch: refs/heads/master
Commit: 5f342049cce9102fb62b4de2d8d8fa691c2e8ac4
Parents: ec79183
Author: Tejas Patil 
Authored: Wed Jul 6 09:18:04 2016 +0100
Committer: Sean Owen 
Committed: Wed Jul 6 09:18:04 2016 +0100

--
 .../spark/sql/hive/execution/ScriptTransformation.scala  | 8 
 1 file changed, 4 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/5f342049/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala
--
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala
index 84990d3..d063dd6 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala
@@ -314,15 +314,15 @@ private class ScriptTransformationWriterThread(
       }
       threwException = false
     } catch {
-      case NonFatal(e) =>
+      case t: Throwable =>
         // An error occurred while writing input, so kill the child process. According to the
         // Javadoc this call will not throw an exception:
-        _exception = e
+        _exception = t
         proc.destroy()
-        throw e
+        throw t
     } finally {
       try {
-        outputStream.close()
+        Utils.tryLogNonFatalError(outputStream.close())
         if (proc.waitFor() != 0) {
           logError(stderrBuffer.toString) // log the stderr circular buffer
         }


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org