[jira] [Updated] (SPARK-4704) SparkSubmitDriverBootstrap doesn't flush output
[ https://issues.apache.org/jira/browse/SPARK-4704?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-4704:
-----------------------------
    Labels:  (was: backport-needed)

> SparkSubmitDriverBootstrap doesn't flush output
> -----------------------------------------------
>
>                 Key: SPARK-4704
>                 URL: https://issues.apache.org/jira/browse/SPARK-4704
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.0
>         Environment: 1.2.0-rc1
>            Reporter: Stephen Haberman
>            Assignee: Sean Owen
>             Fix For: 1.2.2, 1.4.0, 1.3.1
>
> When running spark-submit with a job that immediately blows up (say due to init errors in the job code), there is no error output from spark-submit on the console.
> When I ran spark-class directly, I did see the error/stack trace on the console.
> I believe the issue is SparkSubmitDriverBootstrapper (I had spark.driver.memory set in spark-defaults.conf) not waiting for the RedirectThreads to flush/complete before exiting.
> E.g. here: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/SparkSubmitDriverBootstrapper.scala#L143
> I believe around line 165 or so, stdoutThread.join() and stderrThread.join() calls are necessary to make sure the other threads have had a chance to flush process.getInputStream/getErrorStream to System.out/err before the process exits.
> I've been tripped up by this in similar RedirectThread/process code, hence suspecting this is the problem.

--
This message was sent by Atlassian JIRA (v6.3.4#6332)

---
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
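The fix the reporter suggests — joining the redirect threads after the child process exits — can be sketched as a standalone example. This is an illustrative sketch, not the actual SparkSubmitDriverBootstrapper code: `RedirectThread` here is a simplified stand-in for Spark's helper of the same name, and `BootstrapperSketch`/`runAndRedirect` are hypothetical names.

```scala
import java.io.{InputStream, OutputStream}

// Simplified stand-in for Spark's RedirectThread: pumps one stream into another.
class RedirectThread(in: InputStream, out: OutputStream, name: String)
    extends Thread(name) {
  setDaemon(true)
  override def run(): Unit = {
    val buf = new Array[Byte](1024)
    var n = in.read(buf)
    while (n != -1) {
      out.write(buf, 0, n)
      out.flush()
      n = in.read(buf)
    }
  }
}

object BootstrapperSketch {
  // Launches a child process, mirrors its stdout/stderr, and returns its exit code.
  def runAndRedirect(command: Seq[String]): Int = {
    import scala.jdk.CollectionConverters._
    val process = new ProcessBuilder(command.asJava).start()
    val stdoutThread =
      new RedirectThread(process.getInputStream, System.out, "redirect stdout")
    val stderrThread =
      new RedirectThread(process.getErrorStream, System.err, "redirect stderr")
    stdoutThread.start()
    stderrThread.start()

    val exitCode = process.waitFor()
    // The suggested fix: without these joins, the JVM can exit before the
    // daemon redirect threads finish copying the child's buffered output,
    // silently dropping the error/stack trace the user needed to see.
    stdoutThread.join()
    stderrThread.join()
    exitCode
  }

  def main(args: Array[String]): Unit =
    sys.exit(runAndRedirect(Seq("sh", "-c", "echo from child; exit 3")))
}
```

Because the redirect threads are daemons, `waitFor()` alone does not guarantee they have drained the child's streams; the explicit `join()` calls block until each thread hits end-of-stream, which happens once the child exits and its pipes close.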
[jira] [Updated] (SPARK-4704) SparkSubmitDriverBootstrap doesn't flush output
[ https://issues.apache.org/jira/browse/SPARK-4704?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or updated SPARK-4704:
-----------------------------
    Assignee: Sean Owen
[jira] [Updated] (SPARK-4704) SparkSubmitDriverBootstrap doesn't flush output
[ https://issues.apache.org/jira/browse/SPARK-4704?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or updated SPARK-4704:
-----------------------------
    Target Version/s: 1.2.2, 1.4.0, 1.3.1
[jira] [Updated] (SPARK-4704) SparkSubmitDriverBootstrap doesn't flush output
[ https://issues.apache.org/jira/browse/SPARK-4704?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or updated SPARK-4704:
-----------------------------
    Labels: backport-needed  (was: )
[jira] [Updated] (SPARK-4704) SparkSubmitDriverBootstrap doesn't flush output
[ https://issues.apache.org/jira/browse/SPARK-4704?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or updated SPARK-4704:
-----------------------------
    Fix Version/s: 1.4.0
                   1.2.2
[jira] [Updated] (SPARK-4704) SparkSubmitDriverBootstrap doesn't flush output
[ https://issues.apache.org/jira/browse/SPARK-4704?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or updated SPARK-4704:
-----------------------------
    Affects Version/s: 1.2.0