[
https://issues.apache.org/jira/browse/HIVE-6185?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13868799#comment-13868799
]
Hive QA commented on HIVE-6185:
-------------------------------
{color:red}Overall{color}: -1 no tests executed
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12622492/HIVE-6185.patch
Test results:
http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/863/testReport
Console output:
http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/863/console
Messages:
{noformat}
**** This message was trimmed, see log for full details ****
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests
(includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-it ---
[INFO] Executing tasks
main:
[mkdir] Created dir:
/data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp
[mkdir] Created dir:
/data/hive-ptest/working/apache-svn-trunk-source/itests/target/warehouse
[mkdir] Created dir:
/data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp/conf
[copy] Copying 5 files to
/data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-it ---
[INFO] Installing
/data/hive-ptest/working/apache-svn-trunk-source/itests/pom.xml to
/data/hive-ptest/working/maven/org/apache/hive/hive-it/0.13.0-SNAPSHOT/hive-it-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Integration - Custom Serde 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it-custom-serde
---
[INFO] Deleting
/data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde (includes
= [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @
hive-it-custom-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory
/data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @
hive-it-custom-serde ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @
hive-it-custom-serde ---
[INFO] Compiling 8 source files to
/data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/classes
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @
hive-it-custom-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory
/data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-it-custom-serde
---
[INFO] Executing tasks
main:
[mkdir] Created dir:
/data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp
[mkdir] Created dir:
/data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/warehouse
[mkdir] Created dir:
/data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp/conf
[copy] Copying 5 files to
/data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @
hive-it-custom-serde ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @
hive-it-custom-serde ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-it-custom-serde ---
[INFO] Building jar:
/data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/hive-it-custom-serde-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @
hive-it-custom-serde ---
[INFO] Installing
/data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/hive-it-custom-serde-0.13.0-SNAPSHOT.jar
to
/data/hive-ptest/working/maven/org/apache/hive/hive-it-custom-serde/0.13.0-SNAPSHOT/hive-it-custom-serde-0.13.0-SNAPSHOT.jar
[INFO] Installing
/data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/pom.xml to
/data/hive-ptest/working/maven/org/apache/hive/hive-it-custom-serde/0.13.0-SNAPSHOT/hive-it-custom-serde-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Integration - HCatalog Unit Tests 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog-it-unit
---
[INFO] Deleting
/data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit (includes
= [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @
hive-hcatalog-it-unit ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory
/data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @
hive-hcatalog-it-unit ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @
hive-hcatalog-it-unit ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @
hive-hcatalog-it-unit ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory
/data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @
hive-hcatalog-it-unit ---
[INFO] Executing tasks
main:
[mkdir] Created dir:
/data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/tmp
[mkdir] Created dir:
/data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/warehouse
[mkdir] Created dir:
/data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/tmp/conf
[copy] Copying 5 files to
/data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @
hive-hcatalog-it-unit ---
[INFO] Compiling 7 source files to
/data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/test-classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @
hive-hcatalog-it-unit ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hcatalog-it-unit ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar:
/data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/hive-hcatalog-it-unit-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-hcatalog-it-unit ---
[INFO] Building jar:
/data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/hive-hcatalog-it-unit-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @
hive-hcatalog-it-unit ---
[INFO] Installing
/data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/hive-hcatalog-it-unit-0.13.0-SNAPSHOT.jar
to
/data/hive-ptest/working/maven/org/apache/hive/hive-hcatalog-it-unit/0.13.0-SNAPSHOT/hive-hcatalog-it-unit-0.13.0-SNAPSHOT.jar
[INFO] Installing
/data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/pom.xml
to
/data/hive-ptest/working/maven/org/apache/hive/hive-hcatalog-it-unit/0.13.0-SNAPSHOT/hive-hcatalog-it-unit-0.13.0-SNAPSHOT.pom
[INFO] Installing
/data/hive-ptest/working/apache-svn-trunk-source/itests/hcatalog-unit/target/hive-hcatalog-it-unit-0.13.0-SNAPSHOT-tests.jar
to
/data/hive-ptest/working/maven/org/apache/hive/hive-hcatalog-it-unit/0.13.0-SNAPSHOT/hive-hcatalog-it-unit-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Integration - Testing Utilities 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it-util ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests/util
(includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @
hive-it-util ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory
/data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it-util ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-it-util
---
[INFO] Compiling 41 source files to
/data/hive-ptest/working/apache-svn-trunk-source/itests/util/target/classes
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING :
[INFO] -------------------------------------------------------------
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO] 2 warnings
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR]
/data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/VerifyPartitionIsNotSubdirectoryOfTableHook.java:[47,18]
cannot find symbol
symbol : method getPartitionPath()
location: class org.apache.hadoop.hive.ql.metadata.Partition
[ERROR]
/data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/VerifyPartitionIsSubdirectoryOfTableHook.java:[46,18]
cannot find symbol
symbol : method getPartitionPath()
location: class org.apache.hadoop.hive.ql.metadata.Partition
[INFO] 2 errors
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Hive Integration - Parent ......................... SUCCESS [4.620s]
[INFO] Hive Integration - Custom Serde ................... SUCCESS [11.056s]
[INFO] Hive Integration - HCatalog Unit Tests ............ SUCCESS [5.939s]
[INFO] Hive Integration - Testing Utilities .............. FAILURE [3.795s]
[INFO] Hive Integration - Unit Tests ..................... SKIPPED
[INFO] Hive Integration - Test Serde ..................... SKIPPED
[INFO] Hive Integration - QFile Tests .................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 27.238s
[INFO] Finished at: Sat Jan 11 09:30:55 EST 2014
[INFO] Final Memory: 28M/85M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on
project hive-it-util: Compilation failure: Compilation failure:
[ERROR]
/data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/VerifyPartitionIsNotSubdirectoryOfTableHook.java:[47,18]
cannot find symbol
[ERROR] symbol : method getPartitionPath()
[ERROR] location: class org.apache.hadoop.hive.ql.metadata.Partition
[ERROR]
/data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/VerifyPartitionIsSubdirectoryOfTableHook.java:[46,18]
cannot find symbol
[ERROR] symbol : method getPartitionPath()
[ERROR] location: class org.apache.hadoop.hive.ql.metadata.Partition
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please
read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hive-it-util
+ exit 1
'
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12622492
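The two `cannot find symbol` failures above indicate the test hooks in `itests/util` still call `Partition.getPartitionPath()` after the patch changed the location API on `org.apache.hadoop.hive.ql.metadata.Partition`. A minimal self-contained sketch of the kind of one-line migration involved follows; note that `getDataLocation()` as the replacement accessor is an assumption, and the `Path`/`Partition` classes here are simplified stand-ins, not the real Hive classes:

```java
// Simplified stand-in for org.apache.hadoop.fs.Path (assumption, not the real class).
class Path {
    private final String location;
    Path(String location) { this.location = location; }
    @Override public String toString() { return location; }
}

// Simplified stand-in for org.apache.hadoop.hive.ql.metadata.Partition.
// The old getPartitionPath() accessor is gone, which is why the hooks no
// longer compile; getDataLocation() is assumed to be its Path-returning
// replacement.
class Partition {
    private final Path dataLocation;
    Partition(Path dataLocation) { this.dataLocation = dataLocation; }
    Path getDataLocation() { return dataLocation; }
}

public class HookMigrationSketch {
    public static void main(String[] args) {
        Partition p = new Partition(new Path("/warehouse/t/part=1"));
        // Old (now broken): Path partPath = p.getPartitionPath();
        Path partPath = p.getDataLocation(); // migrated call
        System.out.println(partPath); // prints /warehouse/t/part=1
    }
}
```

If that is the shape of the API change, `VerifyPartitionIsSubdirectoryOfTableHook` and `VerifyPartitionIsNotSubdirectoryOfTableHook` would each need the same substitution at the reported lines.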
> DDLTask is inconsistent in creating a table and adding a partition when
> dealing with location
> ---------------------------------------------------------------------------------------------
>
> Key: HIVE-6185
> URL: https://issues.apache.org/jira/browse/HIVE-6185
> Project: Hive
> Issue Type: Bug
> Components: Query Processor
> Affects Versions: 0.12.0
> Reporter: Xuefu Zhang
> Assignee: Xuefu Zhang
> Attachments: HIVE-6185.patch, HIVE-6185.patch
>
>
> When creating a table, Hive uses URI to represent location:
> {code}
> if (crtTbl.getLocation() != null) {
>   tbl.setDataLocation(new Path(crtTbl.getLocation()).toUri());
> }
> {code}
> When adding a partition, Hive uses Path to represent location:
> {code}
> // set partition path relative to table
> db.createPartition(tbl, addPartitionDesc.getPartSpec(),
>     new Path(tbl.getPath(), addPartitionDesc.getLocation()),
>     addPartitionDesc.getPartParams(),
>     addPartitionDesc.getInputFormat(),
>     addPartitionDesc.getOutputFormat(),
>     addPartitionDesc.getNumBuckets(),
>     addPartitionDesc.getCols(),
>     addPartitionDesc.getSerializationLib(),
>     addPartitionDesc.getSerdeParams(),
>     addPartitionDesc.getBucketCols(),
>     addPartitionDesc.getSortCols());
> {code}
> This disparity means the location values stored in the metastore are encoded
> differently, causing problems with special characters, as demonstrated in
> HIVE-5446. As a result, the code dealing with a table's location differs from
> the code dealing with a partition's location, creating a maintenance burden.
> We need to standardize on Path, in line with the other Path-related cleanup
> efforts.
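The encoding disparity described above can be reproduced with plain `java.net.URI`: converting a location to a URI percent-encodes special characters, while keeping it as a raw path string does not, so the two code paths persist different strings for the same location. This is an illustrative sketch using only the JDK, not Hive's actual metastore write path:

```java
import java.net.URI;

public class LocationEncodingDemo {
    public static void main(String[] args) throws Exception {
        String location = "/warehouse/tbl/p=a b"; // space is a "special" character

        // Table-style handling: location run through a URI, which quotes
        // illegal characters (the multi-argument URI constructor encodes).
        URI asUri = new URI("file", null, location, null);

        // Partition-style handling: the raw path string is kept as-is.
        String asPath = location;

        System.out.println(asUri.getRawPath()); // prints /warehouse/tbl/p=a%20b
        System.out.println(asPath);             // prints /warehouse/tbl/p=a b
    }
}
```

The two printed values differ, which is exactly the kind of mismatch that surfaces as the special-character bugs referenced in HIVE-5446.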
--
This message was sent by Atlassian JIRA
(v6.1.5#6160)