[ https://issues.apache.org/jira/browse/HADOOP-5107?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12759408#action_12759408 ]

Vinod K V commented on HADOOP-5107:
-----------------------------------

Regarding the patch review comments: I didn't go through each and every line, so 
some nits may be missing from this review.

- The patch doesn't work when we go off-line for subsequent runs; the off-line 
support is missing in all the projects. Without it, the build tries to download 
maven-ant-tasks.jar again and gets stuck (see the first sketch after this list).
- Minor: Wrap lines longer than 80 characters.
- In many files, in particular the ivy.xml files of the contrib projects, most of 
the changes are not required and are redundant: the patch removes lines and simply 
adds them back, only reformatting them onto a single line. Undoing these changes 
will greatly reduce the patch size :)
 - In the mapreduce and hdfs ivy.xml files, some cleanup is done. The earlier 
client- and server-specific dependencies looked good and natural too. Did you 
remove them because the classification was premature, or because it didn't gel 
well with your changes?
 - In all the projects' build files, in the setversion target, replaceregexp can 
be done in a single go for all the POMs; it takes a fileset, so separate 
replaceregexp tasks aren't needed (see the sketch after this list).
 - Remove the hadoop-core.pom file from common; it's no longer required.
 - Bump ivy.version to 2.1.0-rc1 in mapreduce and hdfs projects also? The patch 
bumps it for the common project.
 - mapreduce build.xml: Do we need separate mvn-install and mvn-install-mapred 
targets? Even if we do, mvn-install should depend on mvn-install-mapred; a case 
for reuse (see the sketch after this list).
 - common project: Should we take this as an opportunity to rename the core jar 
to the common jar before publishing? It looks odd that the project name is 
common while the jar's name refers to core.
 - I think that in both mapred and hdfs, clean-cache should not delete the whole 
${user.home}/.ivy2/cache/org.apache.hadoop/hadoop-core directory, for example. 
It works for now, but different projects may work with different versions of the 
jar, so mapred's clean-cache should only delete the corresponding version of the 
jar. The same goes for the other directories in the cache (see the sketch after 
this list). Thoughts?
 - Should `ant clean` delete maven-ant-tasks.jar every time? I guess not.
 - Add the pom files in the ivy directory (e.g. ivy/hadoop-mapred-examples.xml) to 
svn/git ignore?
 - As Sharad already commented, can we put in nice descriptions for the new 
targets (see the last sketch after this list)? Of course, we will not need these 
for internal-only targets like mvn-taskdef.
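
Rough sketches for a few of the points above, all in Ant / maven-ant-tasks 
syntax. The target, property, and path names below are my own illustrations, not 
necessarily what the patch uses.

For the off-line point, a guard around the maven-ant-tasks.jar download so that a 
re-run with the jar already in place never touches the network:

{code:xml}
<!-- Probe for an already-downloaded maven-ant-tasks jar. -->
<target name="probe-mvn-ant-tasks">
  <available property="mvn.ant.tasks.present"
             file="${ivy.dir}/maven-ant-tasks-2.0.10.jar"/>
</target>

<!-- Download only when the jar is missing, so off-line re-runs don't get stuck. -->
<target name="get-mvn-ant-tasks" depends="probe-mvn-ant-tasks"
        unless="mvn.ant.tasks.present">
  <get src="${mvn.ant.tasks.url}"
       dest="${ivy.dir}/maven-ant-tasks-2.0.10.jar"
       usetimestamp="true"/>
</target>
{code}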
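
For the setversion point, a single replaceregexp with a nested fileset covers all 
the POM templates at once; the directory and include pattern are only examples:

{code:xml}
<target name="setversion">
  <!-- One replaceregexp over every POM template instead of one task per file. -->
  <replaceregexp match="@version" replace="${version}" flags="g" byline="true">
    <fileset dir="${basedir}/ivy" includes="*.xml"/>
  </replaceregexp>
</target>
{code}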
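
For the mvn-install / mvn-install-mapred point, this is the kind of reuse I mean; 
the POM and jar properties are made up for illustration:

{code:xml}
<target name="mvn-install-mapred" depends="mvn-taskdef, jar, setversion">
  <artifact:pom file="${hadoop-mapred.pom}" id="hadoop.mapred"/>
  <artifact:install file="${hadoop-mapred.jar}">
    <pom refid="hadoop.mapred"/>
  </artifact:install>
</target>

<!-- mvn-install reuses mvn-install-mapred and only adds the extra artifacts. -->
<target name="mvn-install" depends="mvn-install-mapred, jar-test">
  <artifact:pom file="${hadoop-mapred-test.pom}" id="hadoop.mapred.test"/>
  <artifact:install file="${hadoop-mapred-test.jar}">
    <pom refid="hadoop.mapred.test"/>
  </artifact:install>
</target>
{code}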
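
For the clean-cache point, scoping the delete to the current version would look 
roughly like this, assuming the default ivy cache layout and a ${version} property:

{code:xml}
<target name="clean-cache">
  <!-- Delete only this project's version of the cached hadoop artifacts,
       leaving versions used by other projects alone. -->
  <delete>
    <fileset dir="${user.home}/.ivy2/cache/org.apache.hadoop"
             includes="**/*${version}*"/>
  </delete>
</target>
{code}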
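
And for the target descriptions, the standard description attribute is enough to 
make the new targets show up in ant -projecthelp; the target name and wording here 
are just examples:

{code:xml}
<!-- A description makes the target visible in `ant -projecthelp`. -->
<target name="mvn-deploy" depends="mvn-taskdef, jar, setversion"
        description="Publish the hadoop jars and POMs to the Apache Maven repository">
  <!-- deploy steps as in the patch -->
</target>
{code}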

> split the core, hdfs, and mapred jars from each other and publish them 
> independently to the Maven repository
> ------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-5107
>                 URL: https://issues.apache.org/jira/browse/HADOOP-5107
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: build
>    Affects Versions: 0.20.0
>            Reporter: Owen O'Malley
>            Assignee: Giridharan Kesavan
>         Attachments: common-trunk-v1.patch, common-trunk-v4.patch, 
> common-trunk.patch, hadoop-hdfsd-v4.patch, hdfs-trunk-v1.patch, 
> hdfs-trunk-v2.patch, hdfs-trunk.patch, mapred-trunk-v1.patch, 
> mapred-trunk-v2.patch, mapred-trunk-v3.patch, mapred-trunk-v4.patch, 
> mapred-trunk-v5.patch, mapreduce-trunk.patch
>
>
> I think to support splitting the projects, we should publish the jars for 
> 0.20.0 as independent jars to the Maven repository 

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
