[ 
https://issues.apache.org/jira/browse/MAPREDUCE-2483?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13036892#comment-13036892
 ] 

Hudson commented on MAPREDUCE-2483:
-----------------------------------

Integrated in Hadoop-Mapreduce-trunk #685 (See 
[https://builds.apache.org/hudson/job/Hadoop-Mapreduce-trunk/685/])
    MAPREDUCE-2483. Remove duplication of jars between Hadoop subprojects
from build artifacts. (Eric Yang via omalley)

omalley : 
http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1125017
Files : 
* /hadoop/mapreduce/trunk/CHANGES.txt
* /hadoop/mapreduce/trunk/ivy.xml
* /hadoop/mapreduce/trunk/src/contrib/mumak/build.xml
* /hadoop/mapreduce/trunk/build.xml


> Clean up duplication of dependent jar files
> -------------------------------------------
>
>                 Key: MAPREDUCE-2483
>                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-2483
>             Project: Hadoop Map/Reduce
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 0.23.0
>            Reporter: Eric Yang
>            Assignee: Eric Yang
>             Fix For: 0.23.0
>
>         Attachments: MAPREDUCE-2483.patch
>
>
> For trunk, the build and deployment tree look like this:
> hadoop-common-0.2x.y
> hadoop-hdfs-0.2x.y
> hadoop-mapred-0.2x.y
> Technically, mapred's third-party dependent jar files should be fetched 
> from hadoop-common and hadoop-hdfs.  However, they are currently fetched 
> from hadoop-mapred/lib only.  It would be nice to eliminate the 
> duplicated jar files at build time.
> There are two options to manage this dependency list: continue to enhance 
> the ant build structure to fetch and filter jar file dependencies using 
> ivy, or take this as an opportunity to convert the build structure to 
> maven and use maven to manage the provided jar files.

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
