[ https://issues.apache.org/jira/browse/HIVE-15237?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15688957#comment-15688957 ]

Hive QA commented on HIVE-15237:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12840195/HIVE-15237.2.patch

{color:red}ERROR:{color} -1 due to no test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 4 failed/errored test(s), 10732 tests executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver[transform_ppr2] (batchId=133)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[join_acid_non_acid] (batchId=150)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[union_fast_stats] (batchId=145)
org.apache.hive.spark.client.TestSparkClient.testJobSubmission (batchId=272)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/2253/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/2253/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-2253/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 4 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12840195 - PreCommit-HIVE-Build

> Propagate Spark job failure to Hive
> -----------------------------------
>
>                 Key: HIVE-15237
>                 URL: https://issues.apache.org/jira/browse/HIVE-15237
>             Project: Hive
>          Issue Type: Bug
>          Components: Spark
>    Affects Versions: 2.1.0
>            Reporter: Xuefu Zhang
>            Assignee: Rui Li
>         Attachments: HIVE-15237.2.patch, HIVE-15237.2.patch, HIVE-15237.patch
>
>
> If a Spark job fails for some reason, Hive doesn't get any additional error message, which makes it very hard for users to figure out why. Here is an example:
> {code}
> Status: Running (Hive on Spark job[0])
> Job Progress Format
> CurrentTime StageId_StageAttemptId: SucceededTasksCount(+RunningTasksCount-FailedTasksCount)/TotalTasksCount [StageCost]
> 2016-11-17 21:32:53,134       Stage-0_0: 0/23 Stage-1_0: 0/28 
> 2016-11-17 21:32:55,156       Stage-0_0: 0(+1)/23     Stage-1_0: 0/28 
> 2016-11-17 21:32:57,167       Stage-0_0: 0(+3)/23     Stage-1_0: 0/28 
> 2016-11-17 21:33:00,216       Stage-0_0: 0(+3)/23     Stage-1_0: 0/28 
> 2016-11-17 21:33:03,251       Stage-0_0: 0(+3)/23     Stage-1_0: 0/28 
> 2016-11-17 21:33:06,286       Stage-0_0: 0(+4)/23     Stage-1_0: 0/28 
> 2016-11-17 21:33:09,308       Stage-0_0: 0(+2,-3)/23  Stage-1_0: 0/28 
> 2016-11-17 21:33:12,332       Stage-0_0: 0(+2,-3)/23  Stage-1_0: 0/28 
> 2016-11-17 21:33:13,338       Stage-0_0: 0(+21,-3)/23 Stage-1_0: 0/28 
> 2016-11-17 21:33:15,349       Stage-0_0: 0(+21,-5)/23 Stage-1_0: 0/28 
> 2016-11-17 21:33:16,358       Stage-0_0: 0(+18,-8)/23 Stage-1_0: 0/28 
> 2016-11-17 21:33:19,373       Stage-0_0: 0(+21,-8)/23 Stage-1_0: 0/28 
> 2016-11-17 21:33:22,400       Stage-0_0: 0(+18,-14)/23        Stage-1_0: 0/28 
> 2016-11-17 21:33:23,404       Stage-0_0: 0(+15,-20)/23        Stage-1_0: 0/28 
> 2016-11-17 21:33:24,408       Stage-0_0: 0(+12,-23)/23        Stage-1_0: 0/28 
> 2016-11-17 21:33:25,417       Stage-0_0: 0(+9,-26)/23 Stage-1_0: 0/28 
> 2016-11-17 21:33:26,420       Stage-0_0: 0(+12,-26)/23        Stage-1_0: 0/28 
> 2016-11-17 21:33:28,427       Stage-0_0: 0(+9,-29)/23 Stage-1_0: 0/28 
> 2016-11-17 21:33:29,432       Stage-0_0: 0(+12,-29)/23        Stage-1_0: 0/28 
> 2016-11-17 21:33:31,444       Stage-0_0: 0(+18,-29)/23        Stage-1_0: 0/28 
> 2016-11-17 21:33:34,464       Stage-0_0: 0(+18,-29)/23        Stage-1_0: 0/28 
> Status: Failed
> FAILED: Execution Error, return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
> {code}
> It would be better if we could propagate the Spark error to Hive, along the lines of the rough sketch below.
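>
> One possible direction (a rough sketch only, not the actual patch): when the spark-client job handle fails, unwrap the failure cause and include it in the error that SparkTask reports, instead of surfacing only the return code. The sketch uses plain java.util.concurrent types; the jobFuture parameter is a hypothetical stand-in for whatever handle the spark-client API actually returns.
> {code}
> import java.util.concurrent.ExecutionException;
> import java.util.concurrent.Future;
>
> public class SparkFailurePropagationSketch {
>   // Wait for the remote Spark job and surface its failure cause to the user.
>   public static int waitAndReport(Future<?> jobFuture) {
>     try {
>       jobFuture.get();            // block until the remote Spark job finishes
>       return 0;                   // success: keep the existing return code
>     } catch (ExecutionException e) {
>       // The remote failure arrives as the cause of the ExecutionException.
>       Throwable cause = e.getCause() != null ? e.getCause() : e;
>       // Print the root message and stack trace so the user sees more than
>       // "return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask".
>       System.err.println("Spark job failed: " + cause.getMessage());
>       cause.printStackTrace(System.err);
>       return 3;                   // same return code, now with context
>     } catch (InterruptedException e) {
>       Thread.currentThread().interrupt();
>       return 3;
>     }
>   }
> }
> {code}
> In the real code this would sit wherever SparkTask (or its job monitor) collects the final job state, with the message routed through Hive's console and logging facilities rather than System.err.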



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
