[ https://issues.apache.org/jira/browse/HIVE-20056?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16551562#comment-16551562 ]

Hive QA commented on HIVE-20056:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12932472/HIVE-20056.2.patch

{color:red}ERROR:{color} -1 due to no test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 3 failed/errored test(s), 14680 tests executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestMiniDruidCliDriver.testCliDriver[druid_timestamptz] (batchId=193)
org.apache.hadoop.hive.cli.TestMiniDruidCliDriver.testCliDriver[druidmini_joins] (batchId=193)
org.apache.hadoop.hive.cli.TestMiniDruidCliDriver.testCliDriver[druidmini_masking] (batchId=193)
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/12755/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/12755/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-12755/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 3 tests failed
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12932472 - PreCommit-HIVE-Build

> SparkPartitionPruner shouldn't be triggered by Spark tasks
> ----------------------------------------------------------
>
>                 Key: HIVE-20056
>                 URL: https://issues.apache.org/jira/browse/HIVE-20056
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Sahil Takiar
>            Assignee: Sahil Takiar
>            Priority: Major
>         Attachments: HIVE-20056.1.patch, HIVE-20056.2.patch
>
>
> It looks like {{SparkDynamicPartitionPruner}} is being called by every Spark 
> task because it gets created whenever {{getRecordReader}} is called on the 
> associated {{InputFormat}}.
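
For illustration only, here is a minimal sketch of the pattern the description points at. The class and method names other than {{getRecordReader}} are invented for this sketch and are not the actual Hive/Spark classes; it only shows why constructing the pruner inside {{getRecordReader}} makes every task run it, instead of pruning once on the driver when splits are generated.

{code:java}
import java.io.IOException;

public class PrunerPerTaskSketch {

    /** Stand-in for SparkDynamicPartitionPruner: pretends to prune partitions. */
    static class DynamicPartitionPruner {
        void prune(String jobId) {
            System.out.println("Pruning partitions for job " + jobId);
        }
    }

    /** Stand-in for the InputFormat wrapper that each Spark task uses to read its split. */
    static class PruningInputFormat {
        // Problematic pattern: the pruner is created (and run) inside
        // getRecordReader(), which every task calls once per split.
        Object getRecordReader(String split, String jobId) throws IOException {
            DynamicPartitionPruner pruner = new DynamicPartitionPruner();
            pruner.prune(jobId); // executed by every task, not just the driver
            return new Object(); // would be a RecordReader in the real code
        }
    }

    public static void main(String[] args) throws IOException {
        PruningInputFormat format = new PruningInputFormat();
        // Simulate three Spark tasks reading three splits of the same job:
        for (int task = 0; task < 3; task++) {
            format.getRecordReader("split-" + task, "job-1");
        }
        // Output shows pruning running three times, once per task,
        // rather than once during split generation on the driver.
    }
}
{code}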



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
