[ https://issues.apache.org/jira/browse/HIVE-20531?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16649197#comment-16649197 ]

Hive QA commented on HIVE-20531:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12943645/HIVE-20531.11.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/14424/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/14424/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-14424/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-10-14 01:39:08.105
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-14424/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-10-14 01:39:08.108
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at 3527842 HIVE-20723: Allow per table specification of compaction yarn queue (Saurabh Seth via Eugene Koifman)
+ git clean -f -d
Removing ${project.basedir}/
Removing itests/${project.basedir}/
Removing standalone-metastore/metastore-server/src/gen/
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at 3527842 HIVE-20723: Allow per table specification of compaction yarn queue (Saurabh Seth via Eugene Koifman)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-10-14 01:39:09.191
+ rm -rf ../yetus_PreCommit-HIVE-Build-14424
+ mkdir ../yetus_PreCommit-HIVE-Build-14424
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-14424
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-14424/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java: does not exist in index
error: a/itests/hive-unit/src/test/java/org/apache/hadoop/hive/ql/parse/TestReplicationScenarios.java: does not exist in index
error: a/itests/hive-unit/src/test/java/org/apache/hadoop/hive/ql/parse/TestReplicationScenariosAcrossInstances.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/bootstrap/load/table/LoadPartitions.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/bootstrap/load/table/LoadTable.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/ImportSemanticAnalyzer.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/ReplicationSemanticAnalyzer.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/LoadTableDesc.java: does not exist in index
Going to apply patch with: git apply -p1
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc4971415057757229953.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc4971415057757229953.exe, -I/data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/src/main/protobuf/org/apache/hadoop/hive/metastore, --java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/target/generated-sources, /data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator  Version 3.5.2
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process (process-resource-bundles) on project hive-pre-upgrade: Execution process-resource-bundles of goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process failed. ConcurrentModificationException -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-pre-upgrade
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-14424
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12943645 - PreCommit-HIVE-Build

> Repl load on cloud storage file system can skip redundant move or add partition tasks.
> --------------------------------------------------------------------------------------
>
>                 Key: HIVE-20531
>                 URL: https://issues.apache.org/jira/browse/HIVE-20531
>             Project: Hive
>          Issue Type: Sub-task
>          Components: repl
>    Affects Versions: 4.0.0
>            Reporter: mahesh kumar behera
>            Assignee: mahesh kumar behera
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>         Attachments: HIVE-20531.01.patch, HIVE-20531.02.patch, 
> HIVE-20531.03.patch, HIVE-20531.04.patch, HIVE-20531.05.patch, 
> HIVE-20531.06.patch, HIVE-20531.07.patch, HIVE-20531.08.patch, 
> HIVE-20531.09.patch, HIVE-20531.10.patch, HIVE-20531.11.patch
>
>
> In replication load, both add-partition and insert operations are handled 
> through import. Import creates three major tasks: copy, add partition, and 
> move. The copy task copies the data from the source location to a staging 
> directory. The add-partition task (which runs in parallel with copy) creates 
> the partition in the metastore; it is a no-op for insert, since by the time 
> this DDL task executes the partition already exists. The third task, move, 
> moves the files from the staging directory to the actual location and, for 
> insert, adds the insert event to the notification table. It also adds the 
> event for the add-partition operation, which is redundant because the 
> add-partition event has already been written by the DDL task. With the 
> optimization to copy directly to the actual table location on S3, the move 
> task can be avoided when replaying add partition, and replaying insert need 
> not create the add-partition (DDL) task.
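
To make the intended task planning concrete, here is a minimal, purely illustrative sketch of that decision. The class, enum, and task names below (ReplImportTaskPlanner, EventType, "COPY", "MOVE", and so on) are hypothetical and do not correspond to the actual Hive repl/import classes; the sketch only assumes a flag saying whether the copy writes directly to the final table location (as with the S3 optimization described above).

{code:java}
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only; names do not match the real Hive repl/import code.
public final class ReplImportTaskPlanner {

  public enum EventType { ADD_PARTITION, INSERT }

  // True when the copy task writes straight to the final table/partition
  // location (e.g. cloud storage such as S3), so no staging-to-target move
  // is needed afterwards.
  private final boolean copyToTargetLocation;

  public ReplImportTaskPlanner(boolean copyToTargetLocation) {
    this.copyToTargetLocation = copyToTargetLocation;
  }

  public List<String> planTasks(EventType event) {
    List<String> tasks = new ArrayList<>();
    tasks.add("COPY"); // data files are always copied

    if (event == EventType.ADD_PARTITION) {
      // The DDL task creates the partition and writes the add-partition event.
      tasks.add("ADD_PARTITION_DDL");
      if (!copyToTargetLocation) {
        tasks.add("MOVE"); // only needed when data first lands in staging
      }
    } else {
      // INSERT replay: the partition already exists, so no add-partition DDL task.
      if (!copyToTargetLocation) {
        tasks.add("MOVE");
      }
      tasks.add("ADD_INSERT_EVENT"); // record the insert in the notification table
    }
    return tasks;
  }
}
{code}

With copyToTargetLocation set to true, add-partition replay plans only the copy plus the DDL task, and insert replay plans only the copy plus the insert event, which is roughly the redundancy the description says the patch removes.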



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
