[ https://issues.apache.org/jira/browse/AVRO-1567?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14112007#comment-14112007 ]

Tony Reix commented on AVRO-1567:
---------------------------------

1) Avro needs... Avro.

After a complete clean (in my Avro directory and in my Maven .m2 repository), 
I've noticed that some Avro Java parts depend on other Avro parts.
Example:
   [INFO] Building Apache Avro Tools 1.7.4
   [INFO] 
------------------------------------------------------------------------
   Downloading: 
http://repo.maven.apache.org/maven2/org/apache/avro/trevni-avro/1.7.4/trevni-avro-1.7.4.jar

I think this may lead to issues in environments different from the 
"classic" one (non-IBM JVM + x86_64 machine).
Would it be possible to build the needed parts of Avro before the parts 
of Avro that depend on them?
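One possible workaround for the download shown above (a sketch only, assuming lang/java is a standard Maven aggregator whose reactor orders modules by dependency; I haven't verified Avro's actual reactor layout) is to install all modules into the local repository first, so that avro-tools resolves trevni-avro locally instead of downloading it:

```shell
#!/bin/sh
# Sketch: build the whole lang/java reactor and install it into ~/.m2 so
# later modules (e.g. tools) resolve trevni-avro locally.
# The command is echoed here rather than executed.
BUILD_CMD="mvn -DskipTests install"
echo "cd lang/java && $BUILD_CMD"
```

With the modules installed, a later per-module build should no longer need to reach Maven Central for sibling artifacts.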


2) build.sh defaults to hadoop1.

Since the mvn commands dealing with lang/java do not include any -P or -D 
option specifying the Hadoop version, Avro is compiled for Hadoop 1 by default.
Compiling Avro for Hadoop 2 by means of build.sh requires modifying this 
launcher.
For every task dealing with lang/java, I have added "-Phadoop2 
-Dhadoop.version=2" to the mvn command that appears in build.sh.
(Moreover, I've added "-fn" when tests are run, so that the tests 
continue after an error/failure is encountered.)
It would be very useful to have a variable set at the beginning of build.sh 
for using Hadoop 2 by default.
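The change described above could be centralized as a single variable near the top of build.sh; the sketch below is illustrative only (HADOOP_FLAGS is a hypothetical name, not part of the actual script):

```shell
#!/bin/sh
# Sketch only: HADOOP_FLAGS is a hypothetical variable, not present in the
# real build.sh. Every mvn invocation in the script would reuse it, so
# switching Hadoop versions means editing one line.
HADOOP_FLAGS="-Phadoop2 -Dhadoop.version=2"
# The mvn command is echoed here rather than executed.
echo mvn $HADOOP_FLAGS -fn test
```

Switching back to Hadoop 1 would then only require changing the value of the variable.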

> Avro java tools tests fail with IBM JVM
> ---------------------------------------
>
>                 Key: AVRO-1567
>                 URL: https://issues.apache.org/jira/browse/AVRO-1567
>             Project: Avro
>          Issue Type: Bug
>          Components: java
>    Affects Versions: 1.7.4, 1.7.7
>         Environment: RHEL 6.5 on x86_64
> IBM JVM 7.1.1.1
> HADOOP 2.4.1
>            Reporter: Tony Reix
>            Priority: Blocker
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> When using IBM JVM, compared to Oracle JVM, 25 of the Avro Tools tests fail.
> This is due to Avro using Hadoop which uses class:
>    org/apache/hadoop/security/UserGroupInformation.java
> which makes use of:
>    com.sun.security.auth.module.UnixLoginModule
> which does not exist in IBM JVM.
> Instead there is the class:
>    com.ibm.security.auth.module.LinuxLoginModule
> that can be used in UserGroupInformation.java if the JVM is IBM.
> With an IBM-JVM-patched version of Hadoop that takes the kind of JVM into 
> account, these 25 Avro Java Tools tests still fail, because the pom.xml file of:
>     lang/java/tools/
> says unconditionally (starting at line 146 in Avro 1.7.7):
>       <dependency>
>       <groupId>org.apache.hadoop</groupId>
>       <artifactId>hadoop-core</artifactId>
> Using:
>   mvn -Phadoop2 -Dhadoop.version=2 test
> is of no help.
> In fact, hadoop-core exists only for old Hadoop versions (here version 
> 0.20.205.0 is used by Avro), and not for Hadoop 2.4.1.
> Replacing hadoop-core with hadoop-client in the lang/java/tools/pom.xml file 
> does fix the issue, as a work-around.
> However, a more rigorous solution is required, like what is done in 
> lang/java/mapred/pom.xml, where hadoop-core is associated with hadoop1 and 
> hadoop-client is associated with hadoop2.
> I'm not an expert in Maven/pom.xml, and the pom.xml file of tools contains 
> <exclusions> tags I don't understand, so I'm not sure I can provide a correct 
> patch.
> I guess that a Maven/pom.xml expert should be able to fix this in a few 
> minutes, plus testing.
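For reference, the profile-based arrangement the reporter points to in lang/java/mapred/pom.xml could be mirrored in lang/java/tools/pom.xml roughly as follows. This is a sketch only: the profile ids and version properties are assumptions, not the actual contents of either pom.

```xml
<!-- Hypothetical sketch, not the actual tools/pom.xml:
     select the Hadoop artifact per profile instead of unconditionally. -->
<profiles>
  <profile>
    <id>hadoop1</id>
    <dependencies>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>${hadoop1.version}</version>
      </dependency>
    </dependencies>
  </profile>
  <profile>
    <id>hadoop2</id>
    <dependencies>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop2.version}</version>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```

With such profiles in place, `mvn -Phadoop2 test` would pull hadoop-client rather than the obsolete hadoop-core.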



--
This message was sent by Atlassian JIRA
(v6.2#6252)
