[ https://issues.apache.org/jira/browse/HIVE-3416?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13444429#comment-13444429 ]
Zhenxiao Luo commented on HIVE-3416:
------------------------------------
The problem is that TestAvroSerdeUtils depends on hadoop-hdfs-*-tests.jar (it instantiates MiniDFSCluster), which in turn depends on hadoop-common-*-tests.jar (MiniDFSCluster uses StaticMapping).
Neither of these two dependencies is specified in serde/ivy.xml, and neither is included in the hadoop23.test configuration:
hadoop-hdfs-*-tests.jar
hadoop-common-*-tests.jar
Adding hadoop-hdfs-*-tests.jar alone does not implicitly pull in hadoop-common-*-tests.jar, so I propose adding both dependencies explicitly, and also updating the first line of serde/ivy.xml to declare the Maven XML namespace so that m:classifier can be used.
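Roughly, that change would look like the following sketch. The xmlns:m URI is Ivy's standard namespace for Maven extra attributes; the rev property and the conf mapping (hadoop23.test->default) are assumptions based on the build options shown below, and the real file may differ:

<ivy-module version="2.0" xmlns:m="http://ant.apache.org/ivy/maven">
  ...
  <!-- m:classifier="tests" resolves the -tests artifact published alongside the main jar -->
  <dependency org="org.apache.hadoop" name="hadoop-hdfs"
              rev="${hadoop-0.23.version}" conf="hadoop23.test->default" transitive="false">
    <artifact name="hadoop-hdfs" type="jar"/>
    <artifact name="hadoop-hdfs" type="jar" m:classifier="tests"/>
  </dependency>
  ...
</ivy-module>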
However, in build-common.xml, the test.classpath definition contains the following lines:
<fileset dir="${build.ivy.lib.dir}/hadoop0.${hadoop.mr.rev}.shim"
includes="*.jar" erroronmissingdir="false" />
<pathelement location="${build.classes}" />
<!-- we strip out hadoop jars present in places other than the hadoop
shimmed dir-->
<fileset dir="${hive.root}/build/ivy/lib/test" includes="*.jar"
erroronmissingdir="false"
excludes="**/hive_*.jar,**/hive-*.jar,**/hadoop-*.jar"/>
<fileset dir="${hive.root}/build/ivy/lib/default" includes="*.jar"
erroronmissingdir="false"
excludes="**/hive_*.jar,**/hive-*.jar,**/hadoop-*.jar" />
These excludes strip every hadoop-*.jar in the build/ivy/lib/test and build/ivy/lib/default directories out of the test.classpath.
So adding the dependencies to serde/ivy.xml does not work. We need to add them to shims/ivy.xml instead, since the shim directory is included in the test.classpath; see the sketch below.
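A sketch of the shims/ivy.xml addition, assuming the Maven namespace is declared on its <ivy-module> element as above and that the relevant configuration is named hadoop0.23.shim (matching the hadoop0.${hadoop.mr.rev}.shim fileset in build-common.xml); the exact conf and rev names are assumptions:

<!-- in shims/ivy.xml -->
<dependency org="org.apache.hadoop" name="hadoop-hdfs"
            rev="${hadoop-0.23.version}" conf="hadoop0.23.shim->default" transitive="false">
  <artifact name="hadoop-hdfs" type="jar"/>
  <artifact name="hadoop-hdfs" type="jar" m:classifier="tests"/>
</dependency>
<dependency org="org.apache.hadoop" name="hadoop-common"
            rev="${hadoop-0.23.version}" conf="hadoop0.23.shim->default" transitive="false">
  <artifact name="hadoop-common" type="jar"/>
  <artifact name="hadoop-common" type="jar" m:classifier="tests"/>
</dependency>

Because these jars resolve into the hadoop0.23.shim directory, they survive the hadoop-*.jar excludes above and end up on test.classpath.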
> Fix TestAvroSerdeUtils.determineSchemaCanReadSchemaFromHDFS when running Hive on hadoop23
> ------------------------------------------------------------------------------------------
>
> Key: HIVE-3416
> URL: https://issues.apache.org/jira/browse/HIVE-3416
> Project: Hive
> Issue Type: Bug
> Affects Versions: 0.9.0
> Reporter: Zhenxiao Luo
> Assignee: Zhenxiao Luo
>
> TestAvroSerdeUtils.determineSchemaCanReadSchemaFromHDFS is failing when
> running Hive on hadoop23:
> $ ant very-clean package -Dhadoop.version=0.23.1 -Dhadoop-0.23.version=0.23.1 -Dhadoop.mr.rev=23
> $ ant test -Dhadoop.version=0.23.1 -Dhadoop-0.23.version=0.23.1 -Dhadoop.mr.rev=23 -Dtestcase=TestAvroSerdeUtils
> <testcase classname="org.apache.hadoop.hive.serde2.avro.TestAvroSerdeUtils" name="determineSchemaCanReadSchemaFromHDFS" time="0.21">
> <error message="org/apache/hadoop/net/StaticMapping" type="java.lang.NoClassDefFoundError">java.lang.NoClassDefFoundError: org/apache/hadoop/net/StaticMapping
> at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:534)
> at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:489)
> at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:360)
> at org.apache.hadoop.hive.serde2.avro.TestAvroSerdeUtils.determineSchemaCanReadSchemaFromHDFS(TestAvroSerdeUtils.java:187)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:616)
> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
> at org.junit.runners.BlockJUnit4ClassRunner.runNotIgnored(BlockJUnit4ClassRunner.java:79)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:71)
> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:49)
> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
> at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
> at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
> at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
> at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
> at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.net.StaticMapping
> at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
> ... 25 more
> </error>
> </testcase>