[ https://issues.apache.org/jira/browse/HIVE-8204?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14148813#comment-14148813 ]

Hive QA commented on HIVE-8204:
-------------------------------



{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12670270/HIVE-8204.1.patch

Test results: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/989/testReport
Console output: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/989/console
Test logs: 
http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-989/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n /usr/java/jdk1.7.0_45-cloudera ]]
+ export JAVA_HOME=/usr/java/jdk1.7.0_45-cloudera
+ JAVA_HOME=/usr/java/jdk1.7.0_45-cloudera
+ export PATH=/usr/java/jdk1.7.0_45-cloudera/bin/:/usr/java/jdk1.6.0_34/bin:/usr/local/apache-maven-3.0.5/bin:/usr/local/apache-maven-3.0.5/bin:/usr/java/jdk1.6.0_34/bin:/usr/local/apache-ant-1.9.1/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/hiveptest/bin
+ PATH=/usr/java/jdk1.7.0_45-cloudera/bin/:/usr/java/jdk1.6.0_34/bin:/usr/local/apache-maven-3.0.5/bin:/usr/local/apache-maven-3.0.5/bin:/usr/java/jdk1.6.0_34/bin:/usr/local/apache-ant-1.9.1/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/hiveptest/bin
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-TRUNK-Build-989/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ svn = \s\v\n ]]
+ [[ -n '' ]]
+ [[ -d apache-svn-trunk-source ]]
+ [[ ! -d apache-svn-trunk-source/.svn ]]
+ [[ ! -d apache-svn-trunk-source ]]
+ cd apache-svn-trunk-source
+ svn revert -R .
Reverted 'ql/src/test/results/compiler/plan/input2.q.xml'
Reverted 'ql/src/test/results/compiler/plan/input3.q.xml'
Reverted 'ql/src/test/results/compiler/plan/input6.q.xml'
Reverted 'ql/src/test/results/compiler/plan/input7.q.xml'
Reverted 'ql/src/test/results/compiler/plan/input9.q.xml'
Reverted 'ql/src/test/results/compiler/plan/sample2.q.xml'
Reverted 'ql/src/test/results/compiler/plan/sample3.q.xml'
Reverted 'ql/src/test/results/compiler/plan/sample4.q.xml'
Reverted 'ql/src/test/results/compiler/plan/sample5.q.xml'
Reverted 'ql/src/test/results/compiler/plan/sample6.q.xml'
Reverted 'ql/src/test/results/compiler/plan/sample7.q.xml'
Reverted 'ql/src/test/results/compiler/plan/input1.q.xml'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/optimizer/GenMapRedUtils.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/plan/PartitionDesc.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/plan/MapWork.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/parse/TaskCompiler.java'
++ egrep -v '^X|^Performing status on external'
++ awk '{print $2}'
++ svn status --no-ignore
+ rm -rf target datanucleus.log ant/target shims/target shims/0.20/target shims/0.20S/target shims/0.23/target shims/aggregator/target shims/common/target shims/common-secure/target packaging/target hbase-handler/target testutils/target jdbc/target metastore/target itests/target itests/hcatalog-unit/target itests/test-serde/target itests/qtest/target itests/hive-unit-hadoop2/target itests/hive-minikdc/target itests/hive-unit/target itests/custom-serde/target itests/util/target hcatalog/target hcatalog/core/target hcatalog/streaming/target hcatalog/server-extensions/target hcatalog/hcatalog-pig-adapter/target hcatalog/webhcat/svr/target hcatalog/webhcat/java-client/target accumulo-handler/target hwi/target common/target common/src/gen contrib/target service/target serde/target beeline/target odbc/target cli/target ql/dependency-reduced-pom.xml ql/target ql/src/java/org/apache/hadoop/hive/ql/optimizer/GenMapRedUtils.java.orig
+ svn update

Fetching external item into 'hcatalog/src/test/e2e/harness'
External at revision 1627716.

At revision 1627716.
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
The patch does not appear to apply with p0, p1, or p2
+ exit 1
'
{noformat}
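In short, the -1 above is an infrastructure result rather than a test failure: source-prep.sh could not apply HIVE-8204.1.patch to current trunk at any of the strip levels it probes (p0, p1, p2), so no tests were executed and the patch most likely needs a rebase. As a minimal local sketch of the same check before re-uploading (the checkout path and patch filename are illustrative, and this uses plain GNU patch rather than smart-apply-patch.sh):

{code}
# Try the same strip levels the pre-commit script probes, without modifying any files.
cd apache-svn-trunk-source            # a clean, up-to-date trunk checkout (illustrative path)
for p in 0 1 2; do
  if patch -p"$p" --dry-run -f -s < ../HIVE-8204.1.patch; then
    echo "patch applies cleanly with -p$p"
    break
  fi
done
{code}

If none of the strip levels applies cleanly, rebasing against trunk and attaching a fresh patch should let the pre-commit build run the actual tests.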

This message is automatically generated.

ATTACHMENT ID: 12670270

> Dynamic partition pruning fails with IndexOutOfBoundsException
> --------------------------------------------------------------
>
>                 Key: HIVE-8204
>                 URL: https://issues.apache.org/jira/browse/HIVE-8204
>             Project: Hive
>          Issue Type: Bug
>    Affects Versions: 0.14.0
>            Reporter: Prasanth J
>            Assignee: Gunther Hagleitner
>         Attachments: HIVE-8204.1.patch
>
>
> Dynamic partition pruning fails with an IndexOutOfBoundsException when the dimension table is partitioned and the fact table is not.
> Steps to reproduce:
> 1) Partition date_dim table from tpcds on d_date_sk
> 2) Fact table is store_sales which is not partitioned
> 3) Run the following
> {code}
> set hive.stats.fetch.column.stats=true;
> set hive.tez.dynamic.partition.pruning=true;
> explain select d_date 
> from store_sales, date_dim 
> where 
> store_sales.ss_sold_date_sk = date_dim.d_date_sk and 
> date_dim.d_year = 1998;
> {code}
> The stack trace is:
> {code}
> 2014-09-19 19:06:16,254 ERROR ql.Driver (SessionState.java:printError(825)) - FAILED: IndexOutOfBoundsException Index: 0, Size: 0
> java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
>       at java.util.ArrayList.rangeCheck(ArrayList.java:635)
>       at java.util.ArrayList.get(ArrayList.java:411)
>       at org.apache.hadoop.hive.ql.optimizer.RemoveDynamicPruningBySize.process(RemoveDynamicPruningBySize.java:61)
>       at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90)
>       at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:94)
>       at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:78)
>       at org.apache.hadoop.hive.ql.lib.ForwardWalker.walk(ForwardWalker.java:61)
>       at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:109)
>       at org.apache.hadoop.hive.ql.parse.TezCompiler.runStatsDependentOptimizations(TezCompiler.java:277)
>       at org.apache.hadoop.hive.ql.parse.TezCompiler.optimizeOperatorPlan(TezCompiler.java:120)
>       at org.apache.hadoop.hive.ql.parse.TaskCompiler.compile(TaskCompiler.java:97)
>       at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9781)
>       at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:221)
>       at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:74)
>       at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:221)
>       at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:407)
>       at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:303)
>       at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1060)
>       at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1130)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:997)
>       at org.apache.hadoop.hive.ql.Driver.run(Driver.java:987)
>       at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:246)
>       at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:198)
>       at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:408)
>       at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
>       at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
>       at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
