See <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/252/changes>
Changes:

[namit] undo HIVE-883. URISyntaxException when partition value contains special chars (Zheng Shao via namit) The tests were failing - committed by mistake, looked at the wrong results file

[namit] URISyntaxException when partition value contains special chars (Zheng Shao via namit)

[namit] Better error messages for debugging serde problem at reducer input (Zheng Shao via namit)

[dhruba] use the Http POST method to submit the kill command to Hadoop JT.

------------------------------------------
[...truncated 2897 lines...]

init:

compile:
     [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

jar:

init:

install-hadoopcore:

install-hadoopcore-default:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/ivy/ivysettings.xml>
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#common;work...@minerva.apache.org
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	found hadoop#core;0.18.3 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 13ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	0 artifacts copied, 1 already retrieved (0kB/2ms)

install-hadoopcore-internal:

setup:

compile:
     [echo] Compiling: common
    [javac] Compiling 5 source files to <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/common/classes>

jar:
     [echo] Jar: common
      [jar] Building jar: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/common/hive_common.jar>

create-dirs:
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde>
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/classes>
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/test>
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/test/src>
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/test/classes>

compile-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

jar:

init:

dynamic-serde:

compile:
     [echo] Compiling: hive
    [javac] Compiling 221 source files to <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/classes>
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

jar:
     [echo] Jar: serde
      [jar] Building jar: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/hive_serde.jar>

create-dirs:
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore>
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/classes>
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/test>
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/test/src>
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/test/classes>

compile-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

jar:

init:

model-compile:
    [javac] Compiling 8 source files to <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/classes>
     [copy] Copying 1 file to <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/classes>

core-compile:
     [echo] Compiling: 
    [javac] Compiling 31 source files to <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/classes>
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

model-enhance:
[datanucleusenhancer] log4j:WARN No appenders could be found for logger (DataNucleus.Enhancer).
[datanucleusenhancer] log4j:WARN Please initialize the log4j system properly.
[datanucleusenhancer] DataNucleus Enhancer (version 1.1.2) : Enhancement of classes
[datanucleusenhancer] DataNucleus Enhancer : Classpath
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/common/classes>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/classes>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/classes>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/shims/classes>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/hadoopcore/hadoop-0.18.3/hadoop-0.18.3-core.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/anttasks/hive_anttasks.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/common/hive_common.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/serde/hive_serde.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/shims/hive_shims.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/asm-3.1.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/commons-cli-2.0-SNAPSHOT.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/commons-codec-1.3.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/commons-collections-3.2.1.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/commons-lang-2.4.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/commons-logging-1.0.4.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/commons-logging-api-1.0.4.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/datanucleus-core-1.1.2.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/datanucleus-enhancer-1.1.2.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/datanucleus-rdbms-1.1.2.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/derby.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/jdo2-api-2.3-SNAPSHOT.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/json.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/libfb303.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/libthrift.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/log4j-1.2.15.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/lib/velocity-1.5.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/ql/lib/antlr-2.7.7.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/ql/lib/antlr-3.0.1.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/ql/lib/antlr-runtime-3.0.1.jar>
[datanucleusenhancer] >>  <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/ql/lib/stringtemplate-3.1b1.jar>
[datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
[datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
[datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
[datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
[datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
[datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
[datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
[datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
[datanucleusenhancer] DataNucleus Enhancer completed with success for 8 classes. Timings : input=251 ms, enhance=146 ms, total=397 ms. Consult the log for full details

compile:

jar:
     [echo] Jar: metastore
      [jar] Building jar: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/metastore/hive_metastore.jar>

create-dirs:
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/ql>
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/ql/classes>
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/ql/test>
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/ql/test/src>
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/ql/test/classes>

compile-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

deploy-ant-tasks:

create-dirs:

init:

compile:
     [echo] Compiling: anttasks

jar:

init:

ql-init:
    [mkdir] Created dir: <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/ql/gen-java/org/apache/hadoop/hive/ql/parse>

build-grammar:
     [echo] Building Grammar <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/ql/src/java/org/apache/hadoop/hive/ql/parse/Hive.g>  ....
     [java] ANTLR Parser Generator   Version 3.0.1 (August 13, 2007)  1989-2007

compile:
     [echo] Compiling: hive
    [javac] Compiling 448 source files to <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build/ql/classes>
    [javac] <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/ql/src/java/org/apache/hadoop/hive/ql/exec/ExecReducer.java>:201: cannot find symbol
    [javac] symbol  : method getBytes()
    [javac] location: class org.apache.hadoop.io.BytesWritable
    [javac]             Utilities.formatBinaryString(valueWritable.getBytes(), 0, valueWritable.getLength())
    [javac]                                                       ^
    [javac] <http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/ql/src/java/org/apache/hadoop/hive/ql/exec/ExecReducer.java>:201: cannot find symbol
    [javac] symbol  : method getLength()
    [javac] location: class org.apache.hadoop.io.BytesWritable
    [javac]             Utilities.formatBinaryString(valueWritable.getBytes(), 0, valueWritable.getLength())
    [javac]                                                                                   ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 2 errors

BUILD FAILED
<http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build.xml>:138: The following error occurred while executing this line:
<http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/build.xml>:89: The following error occurred while executing this line:
<http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.18/ws/hive/ql/build.xml>:126: Compile failed; see the compiler error output for details.

Total time: 22 seconds
Recording test results
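For context on the failure: the two `cannot find symbol` errors at ExecReducer.java:201 occur because `getBytes()` and `getLength()` are not part of `org.apache.hadoop.io.BytesWritable` in Hadoop 0.18.3; the 0.18-era API appears to have exposed `get()` and `getSize()` instead, with the newer accessor names arriving in later Hadoop releases. Code written against the newer names therefore builds on Hadoop 0.19+ but breaks this 0.18 job. The build's own `hive_shims.jar` suggests Hive isolates such version differences behind a shim layer; a minimal sketch of one way to bridge the two APIs is below, using reflection against a hypothetical stand-in class (`OldBytesWritable`, `bytesOf`, `lengthOf` are illustrative names, not Hive's actual shim API):

```java
import java.lang.reflect.Method;

// Hypothetical stand-in for Hadoop 0.18's BytesWritable, which (unlike
// later releases) has get()/getSize() rather than getBytes()/getLength().
// For illustration only; it is not the real Hadoop class.
class OldBytesWritable {
    private final byte[] bytes;
    OldBytesWritable(byte[] bytes) { this.bytes = bytes; }
    public byte[] get() { return bytes; }
    public int getSize() { return bytes.length; }
}

public class BytesWritableShim {
    // Try the newer accessor first, then fall back to the 0.18-era name,
    // so one compiled binary works against either Hadoop version.
    static byte[] bytesOf(Object writable) throws Exception {
        try {
            Method m = writable.getClass().getMethod("getBytes");
            return (byte[]) m.invoke(writable);
        } catch (NoSuchMethodException e) {
            return (byte[]) writable.getClass().getMethod("get").invoke(writable);
        }
    }

    static int lengthOf(Object writable) throws Exception {
        try {
            Method m = writable.getClass().getMethod("getLength");
            return (Integer) m.invoke(writable);
        } catch (NoSuchMethodException e) {
            return (Integer) writable.getClass().getMethod("getSize").invoke(writable);
        }
    }

    public static void main(String[] args) throws Exception {
        // OldBytesWritable lacks getBytes()/getLength(), so both helpers
        // fall back to the old accessors.
        OldBytesWritable w = new OldBytesWritable(new byte[] {1, 2, 3});
        System.out.println(lengthOf(w)); // prints 3
        System.out.println(bytesOf(w).length); // prints 3
    }
}
```

Reflection is only one option; compiling a per-version shim class against each Hadoop release and selecting it at runtime (as the `shims` module in this build tree implies) avoids the reflection cost on the hot reducer path.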