[ https://issues.apache.org/jira/browse/HIVE-17657?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16460429#comment-16460429 ]

Hive QA commented on HIVE-17657:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12921519/HIVE-17657.07.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/10620/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/10620/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-10620/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit 
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-05-02 02:23:03.638
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ 
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-10620/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-05-02 02:23:03.641
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at 4fd8f03 HIVE-19327: qroupby_rollup_empty.q fails for insert-only 
transactional tables (Steve Yeom reviewed by Sergey Shelukhin, Prasanth 
Jayachandran)
+ git clean -f -d
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at 4fd8f03 HIVE-19327: qroupby_rollup_empty.q fails for insert-only 
transactional tables (Steve Yeom reviewed by Sergey Shelukhin, Prasanth 
Jayachandran)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-05-02 02:23:04.851
+ rm -rf ../yetus_PreCommit-HIVE-Build-10620
+ mkdir ../yetus_PreCommit-HIVE-Build-10620
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-10620
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-10620/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh 
/data/hiveptest/working/scratch/build.patch
Going to apply patch with: git apply -p0
/data/hiveptest/working/scratch/build.patch:243: trailing whitespace.
  
/data/hiveptest/working/scratch/build.patch:627: trailing whitespace.
    
/data/hiveptest/working/scratch/build.patch:916: trailing whitespace.
    
/data/hiveptest/working/scratch/build.patch:977: trailing whitespace.
  
/data/hiveptest/working/scratch/build.patch:1065: trailing whitespace.
# col_name              data_type               comment             
warning: squelched 48 whitespace errors
warning: 53 lines add whitespace errors.
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q 
-Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc5010860018878937823.exe, --version]
protoc-jar: executing: [/tmp/protoc5010860018878937823.exe, 
-I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore,
 
--java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources,
 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
libprotoc 2.5.0
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 40 classes.
ANTLR Parser Generator  Version 3.5.2
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
org/apache/hadoop/hive/ql/parse/HiveParser.g
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2376:5: 
Decision can match input such as "KW_CHECK {KW_EXISTS, KW_TINYINT}" using 
multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2376:5: 
Decision can match input such as "KW_CHECK KW_STRUCT LESSTHAN" using multiple 
alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2376:5: 
Decision can match input such as "KW_CHECK KW_DATETIME" using multiple 
alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2376:5: 
Decision can match input such as "KW_CHECK KW_DATE {LPAREN, StringLiteral}" 
using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:2376:5: 
Decision can match input such as "KW_CHECK KW_UNIONTYPE LESSTHAN" using 
multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
Output file 
/data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HintParser.java
 does not exist: must build 
/data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HintParser.g
org/apache/hadoop/hive/ql/parse/HintParser.g
Generating vector expression code
Generating vector expression test code
Processing annotations
Annotations processed
Processing annotations
No elements to process
Processing annotations
Annotations processed
Processing annotations
No elements to process
[ERROR] Failed to execute goal on project hive-llap-server: Could not resolve 
dependencies for project org.apache.hive:hive-llap-server:jar:3.1.0-SNAPSHOT: 
Failed to collect dependencies for 
[org.apache.hive:hive-exec:jar:3.1.0-SNAPSHOT (compile), 
org.apache.hive:hive-common:jar:3.1.0-SNAPSHOT (compile), 
org.apache.hive:hive-llap-common:jar:3.1.0-SNAPSHOT (compile), 
org.apache.hive:hive-llap-client:jar:3.1.0-SNAPSHOT (compile), 
org.apache.hive:hive-llap-tez:jar:3.1.0-SNAPSHOT (compile), 
org.apache.hive:hive-shims:jar:3.1.0-SNAPSHOT (compile), 
org.apache.hive:hive-serde:jar:3.1.0-SNAPSHOT (compile), 
commons-codec:commons-codec:jar:1.7 (compile), 
commons-lang:commons-lang:jar:2.6 (compile), 
io.netty:netty-all:jar:4.1.17.Final (compile), io.netty:netty:jar:3.10.5.Final 
(compile), org.apache.avro:avro:jar:1.7.7 (compile), 
org.apache.thrift:libthrift:jar:0.9.3 (compile), com.tdunning:json:jar:1.8 
(compile), org.apache.hadoop:hadoop-common:jar:3.1.0 (compile?), 
org.apache.hadoop:hadoop-yarn-services-core:jar:3.1.0 (compile?), 
org.apache.hadoop:hadoop-mapreduce-client-core:jar:3.1.0 (compile?), 
org.apache.orc:orc-core:jar:1.4.3 (compile), 
org.apache.tez:tez-runtime-internals:jar:0.9.1 (compile?), 
org.apache.tez:tez-runtime-library:jar:0.9.1 (compile?), 
org.codehaus.jettison:jettison:jar:1.1 (compile), 
org.eclipse.jetty:jetty-server:jar:9.3.8.v20160314 (compile), 
org.eclipse.jetty:jetty-util:jar:9.3.8.v20160314 (compile), 
org.apache.hive:hive-standalone-metastore:jar:tests:3.1.0-SNAPSHOT (test), 
org.apache.hadoop:hadoop-common:jar:tests:3.1.0 (test), 
org.apache.hadoop:hadoop-hdfs:jar:3.1.0 (test), 
org.apache.hive:hive-llap-common:jar:tests:3.1.0-SNAPSHOT (compile), 
org.apache.hadoop:hadoop-hdfs:jar:tests:3.1.0 (test), junit:junit:jar:4.11 
(test), org.mockito:mockito-all:jar:1.10.19 (test), 
com.sun.jersey:jersey-servlet:jar:1.19 (test), 
org.apache.hbase:hbase-hadoop2-compat:jar:2.0.0-alpha4 (compile), 
org.apache.hbase:hbase-client:jar:2.0.0-alpha4 (compile), 
org.apache.hbase:hbase-server:jar:2.0.0-alpha4 (compile), 
org.apache.hbase:hbase-mapreduce:jar:2.0.0-alpha4 (compile), 
org.apache.hbase:hbase-common:jar:2.0.0-alpha4 (compile), 
org.apache.hbase:hbase-hadoop-compat:jar:2.0.0-alpha4 (compile), 
org.slf4j:slf4j-api:jar:1.7.10 (compile)]: Failed to read artifact descriptor 
for org.glassfish:javax.el:jar:3.0.1-b06-SNAPSHOT: Could not transfer artifact 
org.glassfish:javax.el:pom:3.0.1-b06-SNAPSHOT from/to jvnet-nexus-snapshots 
(https://maven.java.net/content/repositories/snapshots): Failed to transfer 
file: 
https://maven.java.net/content/repositories/snapshots/org/glassfish/javax.el/3.0.1-b06-SNAPSHOT/javax.el-3.0.1-b06-SNAPSHOT.pom.
 Return code is: 402 , ReasonPhrase:Payment Required. -> [Help 1]
[ERROR] Failed to execute goal on project hive-hbase-handler: Could not resolve 
dependencies for project org.apache.hive:hive-hbase-handler:jar:3.1.0-SNAPSHOT: 
Failed to collect dependencies for 
[org.apache.hive:hive-exec:jar:3.1.0-SNAPSHOT (compile), 
commons-lang:commons-lang:jar:2.6 (compile), 
org.apache.hadoop:hadoop-common:jar:3.1.0 (compile?), 
org.apache.hadoop:hadoop-mapreduce-client-core:jar:3.1.0 (compile?), 
org.apache.hbase:hbase-hadoop2-compat:jar:2.0.0-alpha4 (compile), 
org.apache.hbase:hbase-client:jar:2.0.0-alpha4 (compile), 
org.apache.hbase:hbase-server:jar:2.0.0-alpha4 (compile), 
org.apache.hbase:hbase-mapreduce:jar:2.0.0-alpha4 (compile), 
org.apache.hbase:hbase-common:jar:2.0.0-alpha4 (compile), 
org.apache.hbase:hbase-hadoop-compat:jar:2.0.0-alpha4 (compile), 
org.apache.hadoop:hadoop-hdfs:jar:tests:3.1.0 (test), 
org.apache.hbase:hbase-hadoop2-compat:jar:tests:2.0.0-alpha4 (compile), 
org.apache.hbase:hbase-common:jar:tests:2.0.0-alpha4 (test), 
org.apache.hbase:hbase-server:jar:tests:2.0.0-alpha4 (test), 
org.apache.hbase:hbase-mapreduce:jar:tests:2.0.0-alpha4 (test), 
org.apache.hbase:hbase-hadoop-compat:jar:tests:2.0.0-alpha4 (test), 
org.eclipse.jetty:jetty-runner:jar:9.3.8.v20160314 (test), 
com.sun.jersey:jersey-servlet:jar:1.19 (test), 
org.apache.hadoop:hadoop-common:jar:tests:3.1.0 (test), junit:junit:jar:4.11 
(test), org.apache.avro:avro:jar:1.7.6 (compile), 
org.slf4j:slf4j-api:jar:1.7.10 (compile), org.mockito:mockito-all:jar:1.10.19 
(test)]: Failed to read artifact descriptor for 
org.glassfish:javax.el:jar:3.0.1-b06-SNAPSHOT: Could not transfer artifact 
org.glassfish:javax.el:pom:3.0.1-b06-SNAPSHOT from/to jvnet-nexus-snapshots 
(https://maven.java.net/content/repositories/snapshots): Failed to transfer 
file: 
https://maven.java.net/content/repositories/snapshots/org/glassfish/javax.el/3.0.1-b06-SNAPSHOT/javax.el-3.0.1-b06-SNAPSHOT.pom.
 Return code is: 402 , ReasonPhrase:Payment Required. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-llap-server
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12921519 - PreCommit-HIVE-Build
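For context on the -1 above: the failure is in dependency resolution, not in the patch itself. Maven cannot fetch the org.glassfish:javax.el 3.0.1-b06-SNAPSHOT pom because the jvnet-nexus-snapshots repository (maven.java.net) answers HTTP 402. One possible workaround, sketched here with an illustrative placeholder mirror URL (not a verified host for that artifact), is a settings.xml mirror entry that redirects requests for the dead repository id:

```xml
<!-- settings.xml fragment (illustrative sketch; the mirrorOf value must
     match the repository id "jvnet-nexus-snapshots" used by the build,
     and the URL below is a placeholder for a reachable mirror). -->
<mirrors>
  <mirror>
    <id>jvnet-snapshots-mirror</id>
    <mirrorOf>jvnet-nexus-snapshots</mirrorOf>
    <url>https://repo.example.org/snapshots/</url>
  </mirror>
</mirrors>
```

With such a mirror in place (or with the stale SNAPSHOT dependency excluded), the build should get past the 402 and actually exercise the patch.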

> export/import for MM tables is broken
> -------------------------------------
>
>                 Key: HIVE-17657
>                 URL: https://issues.apache.org/jira/browse/HIVE-17657
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Transactions
>            Reporter: Eugene Koifman
>            Assignee: Sergey Shelukhin
>            Priority: Major
>              Labels: mm-gap-2
>         Attachments: HIVE-17657.01.patch, HIVE-17657.02.patch, 
> HIVE-17657.03.patch, HIVE-17657.04.patch, HIVE-17657.05.patch, 
> HIVE-17657.06.patch, HIVE-17657.07.patch, HIVE-17657.patch
>
>
> There is mm_exim.q, but it's not clear from the tests what file structure it 
> creates.
> On import, the txnids in the directory names would have to be remapped if 
> importing to a different cluster.  Perhaps export can be smart and export the 
> highest base_x and accretive deltas (minus aborted ones).  Then import can 
> ...?  It would have to remap txn ids from the archive to new txn ids.  This 
> would then mean that import is made up of several transactions rather than 
> one atomic op.  (All locks must belong to a transaction.)
> One possibility is to open a new txn for each dir in the archive (where the 
> start/end txn of the file name is the same) and commit all of them at once 
> (this needs a new TMgr API).  This assumes using a shared lock (if any!) and 
> thus allows other inserts (not related to the import) to occur.
> What if you have delta_6_9, such as a result of concatenate?  If we stipulate 
> that this must mean there is no delta_6_6 or any other "obsolete" delta in 
> the archive, we can map it to a new single-txn delta_x_x.
> Add a read_only mode for tables (useful in general, and possibly needed for 
> upgrades etc.) and use that to make the above atomic.
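The directory-name remapping described in the quoted issue can be illustrated with a small, self-contained sketch. This is a hypothetical helper, not Hive's actual import code or TMgr API: it only shows how archived base_x / delta_x_y names could collapse to fresh single-txn names, and leaves opening and committing the transactions out of scope.

```python
import re

# Matches ACID directory names like base_5, delta_6_6, delta_6_9.
ACID_DIR = re.compile(r'^(delta|base)_(\d+)(?:_(\d+))?$')

def remap_acid_dirs(archive_dirs, next_txn_id):
    """Remap archived ACID directory names to fresh txn ids on import.

    Assumes the archive contains no obsolete deltas, so a multi-txn
    delta such as delta_6_9 (e.g. produced by concatenate) collapses
    to a single-txn delta_t_t under a newly opened transaction.
    Returns (old_name -> new_name mapping, next unused txn id).
    """
    parsed = []
    for d in archive_dirs:
        m = ACID_DIR.match(d)
        if not m:
            raise ValueError('unrecognized ACID dir: ' + d)
        parsed.append((int(m.group(2)), m.group(1), d))
    mapping = {}
    txn = next_txn_id
    # Walk dirs in original txn order so relative ordering is preserved.
    for _start, kind, d in sorted(parsed):
        mapping[d] = 'base_%d' % txn if kind == 'base' else 'delta_%d_%d' % (txn, txn)
        txn += 1
    return mapping, txn
```

For example, importing an archive holding base_5, delta_6_9, and delta_10_10 onto a cluster whose next txn id is 100 would yield base_100, delta_101_101, and delta_102_102, with each new delta belonging to its own freshly opened transaction that the import would commit as a group.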



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
