[ https://issues.apache.org/jira/browse/HIVE-7762?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14100849#comment-14100849 ]
Hive QA commented on HIVE-7762:
-------------------------------
{color:red}Overall{color}: -1 no tests executed
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12662470/HIVE-7762.patch
Test results:
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/382/testReport
Console output:
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/382/console
Test logs:
http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-382/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n /usr/java/jdk1.7.0_45-cloudera ]]
+ export JAVA_HOME=/usr/java/jdk1.7.0_45-cloudera
+ JAVA_HOME=/usr/java/jdk1.7.0_45-cloudera
+ export PATH=/usr/java/jdk1.7.0_45-cloudera/bin/:/usr/java/jdk1.6.0_34/bin:/usr/local/apache-maven-3.0.5/bin:/usr/local/apache-maven-3.0.5/bin:/usr/java/jdk1.6.0_34/bin:/usr/local/apache-ant-1.9.1/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/hiveptest/bin
+ PATH=/usr/java/jdk1.7.0_45-cloudera/bin/:/usr/java/jdk1.6.0_34/bin:/usr/local/apache-maven-3.0.5/bin:/usr/local/apache-maven-3.0.5/bin:/usr/java/jdk1.6.0_34/bin:/usr/local/apache-ant-1.9.1/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/hiveptest/bin
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-TRUNK-Build-382/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ svn = \s\v\n ]]
+ [[ -n '' ]]
+ [[ -d apache-svn-trunk-source ]]
+ [[ ! -d apache-svn-trunk-source/.svn ]]
+ [[ ! -d apache-svn-trunk-source ]]
+ cd apache-svn-trunk-source
+ svn revert -R .
Reverted 'metastore/src/java/org/apache/hadoop/hive/metastore/ObjectStore.java'
Reverted 'metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStore.java'
Reverted 'common/src/java/org/apache/hadoop/hive/conf/HiveConf.java'
Reverted 'service/src/java/org/apache/hive/service/cli/thrift/ThriftCLIService.java'
Reverted 'service/src/java/org/apache/hive/service/cli/thrift/ThriftHttpCLIService.java'
Reverted 'service/src/java/org/apache/hive/service/cli/thrift/ThriftBinaryCLIService.java'
Reverted 'service/src/java/org/apache/hive/service/cli/CLIService.java'
Reverted 'service/src/java/org/apache/hive/service/cli/session/HiveSessionImpl.java'
Reverted 'service/src/java/org/apache/hive/service/cli/session/SessionManager.java'
Reverted 'service/src/java/org/apache/hive/service/cli/operation/SQLOperation.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/metadata/Hive.java'
++ egrep -v '^X|^Performing status on external'
++ awk '{print $2}'
++ svn status --no-ignore
+ rm -rf target datanucleus.log ant/target shims/target shims/0.20/target
shims/0.20S/target shims/0.23/target shims/aggregator/target
shims/common/target shims/common-secure/target packaging/target
hbase-handler/target testutils/target jdbc/target metastore/target
itests/target itests/hcatalog-unit/target itests/test-serde/target
itests/qtest/target itests/hive-unit-hadoop2/target itests/hive-minikdc/target
itests/hive-unit/target itests/custom-serde/target itests/util/target
hcatalog/target hcatalog/core/target hcatalog/streaming/target
hcatalog/server-extensions/target hcatalog/hcatalog-pig-adapter/target
hcatalog/webhcat/svr/target hcatalog/webhcat/java-client/target hwi/target
common/target common/src/gen service/target
service/src/java/org/apache/hive/service/server/ThreadWithGarbageCleanup.java
service/src/java/org/apache/hive/service/server/ThreadFactoryWithGarbageCleanup.java
contrib/target serde/target beeline/target odbc/target cli/target
ql/dependency-reduced-pom.xml ql/target
+ svn update
U ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java
U testutils/ptest2/src/main/java/org/apache/hive/ptest/execution/conf/TestParser.java
Fetching external item into 'hcatalog/src/test/e2e/harness'
Updated external to revision 1618663.
Updated to revision 1618663.
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
The patch does not appear to apply with p0, p1, or p2
+ exit 1
'
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12662470
> Enhancement while getting partitions via webhcat client
> -------------------------------------------------------
>
> Key: HIVE-7762
> URL: https://issues.apache.org/jira/browse/HIVE-7762
> Project: Hive
> Issue Type: Improvement
> Components: WebHCat
> Reporter: Suhas Vasu
> Priority: Minor
> Attachments: HIVE-7762.patch
>
>
> HCatalog creates partitions with lower-cased partition-key names, but getting
> partitions from HCatalog via the WebHCat client doesn't account for this, so
> the client throws exceptions when the keys are passed in their original case.
> Ex:
> CREATE EXTERNAL TABLE in_table (word STRING, cnt INT) PARTITIONED BY (Year
> STRING, Month STRING, Date STRING, Hour STRING, Minute STRING) STORED AS
> TEXTFILE LOCATION '/user/suhas/hcat-data/in/';
> Then I try to get the partition with:
> {noformat}
> String inputTableName = "in_table";
> String database = "default";
> Map<String, String> partitionSpec = new HashMap<String, String>();
> partitionSpec.put("Year", "2014");
> partitionSpec.put("Month", "08");
> partitionSpec.put("Date", "11");
> partitionSpec.put("Hour", "00");
> partitionSpec.put("Minute", "00");
> HCatClient client = get(catalogUrl);
> HCatPartition hCatPartition = client.getPartition(database, inputTableName, partitionSpec);
> {noformat}
> This fails with:
> {noformat}
> Exception in thread "main" org.apache.hcatalog.common.HCatException : 9001 : Exception occurred while processing HCat request : Invalid partition-key specified: year
> at org.apache.hcatalog.api.HCatClientHMSImpl.getPartition(HCatClientHMSImpl.java:366)
> at com.inmobi.demo.HcatPartitions.main(HcatPartitions.java:34)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> {noformat}
> The same code works if I use lower-case keys:
> {noformat}
> partitionSpec.put("year", "2014");
> partitionSpec.put("month", "08");
> partitionSpec.put("date", "11");
> partitionSpec.put("hour", "00");
> partitionSpec.put("minute", "00");
> {noformat}
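> A minimal sketch of a possible caller-side workaround (not necessarily what the
> attached HIVE-7762.patch does) is to lower-case the partition-spec keys before
> the lookup, reusing client, database, inputTableName, and partitionSpec from
> the snippet above:
> {noformat}
> // Workaround sketch: normalize the caller's partition-spec keys to lower case
> // so they match the lower-cased key names HCatalog stores for the partition.
> Map<String, String> normalizedSpec = new HashMap<String, String>();
> for (Map.Entry<String, String> entry : partitionSpec.entrySet()) {
>     normalizedSpec.put(entry.getKey().toLowerCase(), entry.getValue());
> }
> HCatPartition hCatPartition = client.getPartition(database, inputTableName, normalizedSpec);
> {noformat}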