[ https://issues.apache.org/jira/browse/HIVE-21109?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16798789#comment-16798789 ]

Hive QA commented on HIVE-21109:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12963371/HIVE-21109.03.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/16628/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/16628/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-16628/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2019-03-22 07:17:09.138
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-16628/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2019-03-22 07:17:09.140
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
   2b181dc..4c87512  master     -> origin/master
+ git reset --hard HEAD
HEAD is now at 2b181dc HIVE-21283: Create Synonym mid for substr, position for locate (Mani M, reviewed by Sankar Hariappan)
+ git clean -f -d
Removing standalone-metastore/metastore-server/src/gen/
+ git checkout master
Already on 'master'
Your branch is behind 'origin/master' by 2 commits, and can be fast-forwarded.
  (use "git pull" to update your local branch)
+ git reset --hard origin/master
HEAD is now at 4c87512 HIVE-21430 : INSERT into a dynamically partitioned table with hive.stats.autogather = false throws a MetaException. (Ashutosh Bapat, reviewed by Mahesh Kumar Behera )
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2019-03-22 07:17:10.662
+ rm -rf ../yetus_PreCommit-HIVE-Build-16628
+ mkdir ../yetus_PreCommit-HIVE-Build-16628
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-16628
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-16628/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/itests/hive-unit/src/test/java/org/apache/hadoop/hive/ql/parse/TestReplicationScenariosIncrementalLoadAcidTables.java: does not exist in index
error: a/itests/hive-unit/src/test/java/org/apache/hadoop/hive/ql/parse/TestStatsReplicationScenarios.java: does not exist in index
error: a/itests/hive-unit/src/test/java/org/apache/hadoop/hive/ql/parse/TestStatsReplicationScenariosNoAutogather.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/ddl/table/CreateTableDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/ddl/table/CreateTableOperation.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/ColumnStatsUpdateTask.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/MoveTask.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/ReplDumpTask.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/bootstrap/events/filesystem/FSTableEvent.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/io/AcidUtils.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/lockmgr/DbTxnManager.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/lockmgr/DummyTxnManager.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/lockmgr/HiveTxnManager.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/metadata/Hive.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/metadata/Table.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/ImportSemanticAnalyzer.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/repl/dump/TableExport.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/repl/dump/events/UpdatePartColStatHandler.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/repl/dump/events/UpdateTableColStatHandler.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/repl/load/message/UpdatePartColStatHandler.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/repl/load/message/UpdateTableColStatHandler.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/AddPartitionDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/ColumnStatsUpdateWork.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/ImportTableDesc.java: does not exist in index
error: a/ql/src/test/org/apache/hadoop/hive/ql/stats/TestStatsUpdaterThread.java: does not exist in index
error: a/standalone-metastore/metastore-common/src/main/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java: does not exist in index
error: a/standalone-metastore/metastore-common/src/main/java/org/apache/hadoop/hive/metastore/IMetaStoreClient.java: does not exist in index
error: a/standalone-metastore/metastore-common/src/main/java/org/apache/hadoop/hive/metastore/txn/TxnCommonUtils.java: does not exist in index
error: a/standalone-metastore/metastore-server/src/main/java/org/apache/hadoop/hive/metastore/HiveMetaStore.java: does not exist in index
error: a/standalone-metastore/metastore-server/src/main/java/org/apache/hadoop/hive/metastore/ObjectStore.java: does not exist in index
error: a/standalone-metastore/metastore-server/src/main/java/org/apache/hadoop/hive/metastore/cache/CachedStore.java: does not exist in index
error: a/standalone-metastore/metastore-server/src/test/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClientPreCatalog.java: does not exist in index
error: patch failed: ql/src/java/org/apache/hadoop/hive/ql/metadata/Hive.java:1882
Falling back to three-way merge...
Applied patch to 'ql/src/java/org/apache/hadoop/hive/ql/metadata/Hive.java' cleanly.
Going to apply patch with: git apply -p1
error: patch failed: ql/src/java/org/apache/hadoop/hive/ql/metadata/Hive.java:1882
Falling back to three-way merge...
Applied patch to 'ql/src/java/org/apache/hadoop/hive/ql/metadata/Hive.java' cleanly.
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc6032670875205440835.exe, --version]
protoc-jar: executing: [/tmp/protoc6032670875205440835.exe, -I/data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/src/main/protobuf/org/apache/hadoop/hive/metastore, --java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/target/generated-sources, /data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
libprotoc 2.5.0
ANTLR Parser Generator  Version 3.5.2
protoc-jar: executing: [/tmp/protoc4800287762379912287.exe, --version]
libprotoc 2.5.0
ANTLR Parser Generator  Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-server/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-server/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 41 classes.
ANTLR Parser Generator  Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
org/apache/hadoop/hive/ql/parse/HiveParser.g
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HintParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HintParser.g
org/apache/hadoop/hive/ql/parse/HintParser.g
Generating vector expression code
Generating vector expression test code
Processing annotations
Annotations processed
Processing annotations
No elements to process
[ERROR] COMPILATION ERROR :
[ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/test/org/apache/hadoop/hive/ql/stats/TestStatsUpdaterThread.java:[319,15] method testTxnDynamicPartitions() is already defined in class org.apache.hadoop.hive.ql.stats.TestStatsUpdaterThread
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:testCompile (default-testCompile) on project hive-exec: Compilation failure
[ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/test/org/apache/hadoop/hive/ql/stats/TestStatsUpdaterThread.java:[319,15] method testTxnDynamicPartitions() is already defined in class org.apache.hadoop.hive.ql.stats.TestStatsUpdaterThread
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-exec
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-16628
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12963371 - PreCommit-HIVE-Build
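
For context on the -1 above: the build died during test compilation, not in the tests themselves. After the patch application (including the three-way-merge fallback), TestStatsUpdaterThread.java ends up with two methods named testTxnDynamicPartitions(), and javac rejects duplicate method signatures within a single class. A minimal, stand-alone sketch of that failure mode (illustrative only, not Hive code; it intentionally does not compile):

{code:java}
// Illustrative only: reproduces the "method ... is already defined in class ..."
// javac error reported for TestStatsUpdaterThread.java:[319,15].
public class DuplicateMethodExample {

  // First definition is fine on its own.
  void testTxnDynamicPartitions() {
    System.out.println("first copy");
  }

  // A second method with an identical signature makes javac fail with:
  //   error: method testTxnDynamicPartitions() is already defined
  //          in class DuplicateMethodExample
  void testTxnDynamicPartitions() {
    System.out.println("second copy");
  }
}
{code}

The usual remedy is to rebase the patch against current master so only one copy of the test remains (or rename the new test) before resubmitting.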

> Stats replication for ACID tables.
> ----------------------------------
>
>                 Key: HIVE-21109
>                 URL: https://issues.apache.org/jira/browse/HIVE-21109
>             Project: Hive
>          Issue Type: Sub-task
>            Reporter: Ashutosh Bapat
>            Assignee: Ashutosh Bapat
>            Priority: Major
>         Attachments: HIVE-21109.01.patch, HIVE-21109.02.patch, 
> HIVE-21109.03.patch
>
>
> Transactional tables require a writeID associated with the stats update. This 
> writeId needs to be in sync with the writeId on the source and hence needs to 
> be replicated from the source.
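
To make the quoted requirement concrete: when a column-stats update is replayed on the replica for a transactional table, it must be tagged with the same writeId the source used, not a freshly allocated one, so the stats remain valid for the same snapshot on both clusters. A hedged sketch of that idea, with hypothetical class and method names (not the real Hive or metastore API):

{code:java}
// Hypothetical sketch only; class and method names are illustrative,
// not the actual Hive or metastore API.
public final class StatsReplicationSketch {

  /** Column-statistics payload for one table, as received from the source dump. */
  public static final class ColumnStats {
    final String tableName;
    final long sourceWriteId;  // writeId the source associated with these stats

    ColumnStats(String tableName, long sourceWriteId) {
      this.tableName = tableName;
      this.sourceWriteId = sourceWriteId;
    }
  }

  /** Stand-in for the target cluster's metastore. */
  public interface TargetMetastore {
    void updateColumnStats(String tableName, long writeId);
  }

  /**
   * Replays one stats-update event on the target. For a transactional (ACID)
   * table the update reuses the writeId replicated from the source, so the
   * stats stay valid for exactly the same snapshot as on the source; for a
   * non-transactional table no writeId is involved.
   */
  public static void replayStatsEvent(TargetMetastore target, ColumnStats stats,
                                      boolean isTransactionalTable) {
    long writeId = isTransactionalTable
        ? stats.sourceWriteId  // must stay in sync with the source writeId
        : 0L;                  // non-ACID tables do not carry a writeId
    target.updateColumnStats(stats.tableName, writeId);
  }
}
{code}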



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
