[ https://issues.apache.org/jira/browse/HIVE-16799?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16032611#comment-16032611 ]

Hive QA commented on HIVE-16799:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12870741/HIVE-16799.patch

{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 4 failed/errored test(s), 10814 tests executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestBeeLineDriver.testCliDriver[insert_overwrite_local_directory_1] (batchId=237)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[columnstats_part_coltype] (batchId=157)
org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver[vector_if_expr] (batchId=145)
org.apache.hadoop.hive.cli.TestPerfCliDriver.testCliDriver[query14] (batchId=232)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/5496/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/5496/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-5496/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 4 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12870741 - PreCommit-HIVE-Build

> Control the max number of tasks for a stage in a Spark job
> ----------------------------------------------------------
>
>                 Key: HIVE-16799
>                 URL: https://issues.apache.org/jira/browse/HIVE-16799
>             Project: Hive
>          Issue Type: Improvement
>            Reporter: Xuefu Zhang
>            Assignee: Xuefu Zhang
>         Attachments: HIVE-16799.patch
>
>
> HIVE-16552 gives admins an option to control the maximum number of tasks a 
> Spark job may have. However, this may not be sufficient, as it tends to 
> penalize jobs that have many stages while favoring jobs that have fewer 
> stages. Ideally, we should also limit the number of tasks in a stage, which 
> is closer to the maximum number of mappers or reducers in an MR job.
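
For context, a minimal sketch (in Java, since Hive is a Java codebase) of how such a per-stage cap might be enforced before job submission. The config key {{hive.spark.stage.max.tasks}} and the helper below are hypothetical illustrations modeled on the per-job cap from HIVE-16552, not the actual implementation in the attached patch.

{code:java}
import java.util.Map;

public class StageTaskLimiter {

  // Hypothetical config key, named by analogy with the per-job
  // limit introduced in HIVE-16552; not an actual Hive property.
  static final String MAX_STAGE_TASKS_KEY = "hive.spark.stage.max.tasks";

  /**
   * Rejects a job if any single stage would exceed the configured
   * per-stage task cap. stageTaskCounts maps stage id to the number
   * of tasks planned for that stage.
   */
  static void checkStageTaskLimits(Map<Integer, Integer> stageTaskCounts,
                                   int maxTasksPerStage) {
    if (maxTasksPerStage <= 0) {
      // A non-positive value disables the check, a common convention
      // for such limits (assumption, not confirmed by the patch).
      return;
    }
    for (Map.Entry<Integer, Integer> e : stageTaskCounts.entrySet()) {
      if (e.getValue() > maxTasksPerStage) {
        throw new IllegalStateException("Stage " + e.getKey() + " would run "
            + e.getValue() + " tasks, exceeding the per-stage cap of "
            + maxTasksPerStage);
      }
    }
  }
}
{code}

Checking each stage independently, rather than the job's task total, avoids the bias described above: a job with many small stages passes, while any single oversized stage is rejected regardless of how few stages the job has.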



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
