OK. Now I understand that this error is due to a missing Hadoop native library. If I manually add "libhadoop.so" to java.library.path for this unit test, it passes. So the hadoop 2.2.0 artifact coming from the Maven repository either includes only the 32-bit Hadoop native library, or is missing it entirely. Now the question is: what is the correct way to run the unit tests in the new Maven build?

Thanks
Yong
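As a quick sanity check for the problem described above, a small standalone program can confirm whether the JVM can actually resolve libhadoop.so from java.library.path (this is a minimal sketch of my own; the class name NativeLibCheck is not part of Hive or Hadoop — on Linux, System.loadLibrary("hadoop") maps to libhadoop.so):

```java
// NativeLibCheck.java — diagnose whether the JVM can find the Hadoop
// native library on java.library.path. Hypothetical helper, not Hive code.
public class NativeLibCheck {

    /** Returns true if libhadoop.so can be loaded from java.library.path. */
    static boolean hadoopNativeAvailable() {
        try {
            System.loadLibrary("hadoop"); // resolves to libhadoop.so on Linux
            return true;
        } catch (UnsatisfiedLinkError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        System.out.println("os.arch = " + System.getProperty("os.arch"));
        System.out.println("native hadoop library available: "
                + hadoopNativeAvailable());
    }
}
```

If this reports the library as unavailable (or os.arch shows a 64-bit JVM while the bundled libhadoop.so is 32-bit), one workaround is to point the Surefire JVM at a native-library directory when running the tests, e.g. `mvn test -DargLine="-Djava.library.path=/path/to/hadoop/lib/native"` — the path here is illustrative and depends on where a matching 64-bit libhadoop.so was built or installed.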
From: java8...@hotmail.com
To: user@hive.apache.org
Subject: Hive trunk unit test failed
Date: Wed, 26 Feb 2014 14:49:41 -0500

Hi,

I tried to run all the tests of the current Hive trunk code on my local Linux x64 machine. My "mvn clean package -DskipTests -Phadoop-2 -Pdist" works fine when I skip the tests. The following unit test failed, and then the build stopped. I traced the code down to a native method invoked at "org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative(Native Method)", which throws an InvocationTargetException. My questions are:

1) Does this mean the native code is not available in my environment, causing the above error?
2) If so, since the latest Hive build uses Maven and I can see all the hadoop-2.2.0 jar files downloaded into my local repository, why does this error still happen?
3) Is it possible that my local environment is 64-bit, but the default hadoop-2.2.0 comes with 32-bit native code? If so, how do I fix that during the Hive build?

Thanks
Yong

Running org.apache.hadoop.hive.serde2.avro.TestAvroSerdeUtils
Tests run: 8, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.802 sec <<< FAILURE! - in org.apache.hadoop.hive.serde2.avro.TestAvroSerdeUtils
detemineSchemaTriesToOpenUrl(org.apache.hadoop.hive.serde2.avro.TestAvroSerdeUtils)  Time elapsed: 0.377 sec  <<< ERROR!
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
        at org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative(Native Method)
        at org.apache.hadoop.security.JniBasedUnixGroupsMapping.<clinit>(JniBasedUnixGroupsMapping.java:49)
        at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:38)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
        at org.apache.hadoop.security.Groups.<init>(Groups.java:55)
        at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:182)
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:235)
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:214)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:669)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:571)
        at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2590)
        at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2582)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2448)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
        at org.apache.hadoop.hive.serde2.avro.AvroSerdeUtils.getSchemaFromFS(AvroSerdeUtils.java:110)
        at org.apache.hadoop.hive.serde2.avro.AvroSerdeUtils.determineSchemaOrThrowException(AvroSerdeUtils.java:71)
        at org.apache.hadoop.hive.serde2.avro.TestAvroSerdeUtils.detemineSchemaTriesToOpenUrl(TestAvroSerdeUtils.java:139)