[ https://issues.apache.org/jira/browse/HIVE-15269?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15833014#comment-15833014 ]

Hive QA commented on HIVE-15269:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12848654/HIVE-15269.16.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/3102/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/3102/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-3102/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2017-01-21 14:52:26.368
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-3102/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2017-01-21 14:52:26.370
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at d9343f6 HIVE-15544 : Support scalar subqueries (Vineet Garg via Ashutosh Chauhan)
+ git clean -f -d
Removing metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStore.java.orig
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at d9343f6 HIVE-15544 : Support scalar subqueries (Vineet Garg via Ashutosh Chauhan)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2017-01-21 14:52:27.284
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java: No such file or directory
error: a/itests/src/test/resources/testconfiguration.properties: No such file or directory
error: a/orc/src/test/org/apache/orc/impl/TestRecordReaderImpl.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/AbstractMapJoinOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/CommonJoinOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ExprNodeColumnEvaluator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ExprNodeConstantDefaultEvaluator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ExprNodeConstantEvaluator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ExprNodeEvaluator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ExprNodeEvaluatorFactory.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ExprNodeEvaluatorHead.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ExprNodeEvaluatorRef.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ExprNodeFieldEvaluator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ExprNodeGenericFuncEvaluator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/FilterOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/GroupByOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/HashTableSinkOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/JoinUtil.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ObjectCache.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ObjectCacheWrapper.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/SelectOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ObjectCache.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/tez/LlapObjectCache.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/tez/MapRecordProcessor.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/tez/ObjectCache.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/tez/ReduceRecordProcessor.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorMapJoinOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorSMBMapJoinOperator.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/sarg/ConvertAstToSearchArg.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/DynamicPartitionPruningOptimization.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/FixedBucketPruningOptimizer.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/RedundantDynamicPruningConditionsRemoval.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/optimizer/stats/annotation/StatsRulesProcFactory.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/GenTezUtils.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/ParseContext.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/TaskCompiler.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/TezCompiler.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/AggregationDesc.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/BaseWork.java: No such file or directory
error: a/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFEvaluator.java: No such file or directory
error: a/ql/src/test/org/apache/hadoop/hive/ql/io/sarg/TestConvertAstToSearchArg.java: No such file or directory
error: a/ql/src/test/org/apache/hadoop/hive/ql/io/sarg/TestSearchArgumentImpl.java: No such file or directory
error: a/ql/src/test/org/apache/hadoop/hive/ql/optimizer/physical/TestVectorizer.java: No such file or directory
error: a/ql/src/test/results/clientpositive/llap/dynamic_partition_pruning.q.out: No such file or directory
error: a/ql/src/test/results/clientpositive/llap/join32_lessSize.q.out: No such file or directory
error: a/ql/src/test/results/clientpositive/llap/llap_partitioned.q.out: No such file or directory
error: a/ql/src/test/results/clientpositive/llap/mergejoin.q.out: No such file or directory
error: a/ql/src/test/results/clientpositive/llap/orc_llap.q.out: No such file or directory
error: a/ql/src/test/results/clientpositive/llap/vectorized_dynamic_partition_pruning.q.out: No such file or directory
error: a/ql/src/test/results/clientpositive/perf/query16.q.out: No such file or directory
error: a/ql/src/test/results/clientpositive/perf/query83.q.out: No such file or directory
error: a/ql/src/test/results/clientpositive/show_functions.q.out: No such file or directory
error: a/ql/src/test/results/clientpositive/tez/explainanalyze_3.q.out: No such file or directory
error: a/ql/src/test/results/clientpositive/tez/explainuser_3.q.out: No such file or directory
error: a/storage-api/src/java/org/apache/hadoop/hive/ql/io/sarg/SearchArgumentFactory.java: No such file or directory
error: a/storage-api/src/java/org/apache/hadoop/hive/ql/io/sarg/SearchArgumentImpl.java: No such file or directory
error: a/storage-api/src/java/org/apache/hive/common/util/BloomFilter.java: No such file or directory
The patch does not appear to apply with p0, p1, or p2
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12848654 - PreCommit-HIVE-Build

> Dynamic Min-Max runtime-filtering for Tez
> -----------------------------------------
>
>                 Key: HIVE-15269
>                 URL: https://issues.apache.org/jira/browse/HIVE-15269
>             Project: Hive
>          Issue Type: New Feature
>            Reporter: Jason Dere
>            Assignee: Deepak Jaiswal
>         Attachments: HIVE-15269.10.patch, HIVE-15269.11.patch, 
> HIVE-15269.12.patch, HIVE-15269.13.patch, HIVE-15269.14.patch, 
> HIVE-15269.15.patch, HIVE-15269.16.patch, HIVE-15269.1.patch, 
> HIVE-15269.2.patch, HIVE-15269.3.patch, HIVE-15269.4.patch, 
> HIVE-15269.5.patch, HIVE-15269.6.patch, HIVE-15269.7.patch, 
> HIVE-15269.8.patch, HIVE-15269.9.patch
>
>
> If a dimension table and fact table are joined:
> {noformat}
> select *
> from store join store_sales on (store.id = store_sales.store_id)
> where store.s_store_name = 'My Store'
> {noformat}
> One optimization that can be done is to get the min/max store id values that 
> come out of the scan/filter of the store table, and send these min/max values 
> (via a Tez edge) to the task that is scanning the store_sales table.
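> Conceptually, the value pair sent over the edge corresponds to a 
> dimension-side aggregation like the sketch below (hypothetical HiveQL for 
> illustration only; the runtime would derive these values inside the plan 
> rather than running a separate user query):
> {noformat}
> -- sketch: min/max of the join key surviving the dimension-table filter
> select min(store.id), max(store.id)
> from store
> where store.s_store_name = 'My Store';
> {noformat}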
> We can add a BETWEEN(min, max) predicate to the store_sales TableScan, where 
> this predicate can be pushed down to the storage handler (for example, for 
> the ORC format). Pushing a min/max predicate to the ORC reader would allow us 
> to avoid having to read entire row groups during the table scan.
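> The fact-side scan would then behave as if the runtime-supplied bounds had 
> been inlined, roughly as sketched below (:min_id and :max_id are placeholders 
> for the values delivered over the Tez edge, not real HiveQL syntax):
> {noformat}
> -- sketch: synthetic BETWEEN predicate on the fact table's join key; when
> -- pushed to the ORC reader as part of the SearchArgument, row groups whose
> -- column statistics fall entirely outside [min, max] can be skipped
> select *
> from store_sales
> where store_sales.store_id between :min_id and :max_id;
> {noformat}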



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
