[ https://issues.apache.org/jira/browse/HADOOP-10115?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14352534#comment-14352534 ]
Allen Wittenauer commented on HADOOP-10115:
-------------------------------------------

It's interesting that test-patch passed yet this patch failed on my machine... bizarre! In any case, both the copyfIfNotExists and copy functions need some surgery now that the fragile ls has been fixed. $child is a full path now, so we need to do something with it. I'm thinking the easiest is to do something like:

{code}
local dir
local child

if [ -d "$src" ]; then
  for dir in "$src"/* ; do
    child=${dir/${src}\/}
{code}

in both functions. This will restore $child to being just the directory name in a (I think) metachar-safe way, as well as properly localizing those vars! Also, this line:

{code}
// we need to copy httpfs and kms as is
{code}

needs to have the // replaced with #. This is shell code, after all. :)

Thanks! This is looking good!

> Exclude duplicate jars in hadoop package under different component's lib
> ------------------------------------------------------------------------
>
>                 Key: HADOOP-10115
>                 URL: https://issues.apache.org/jira/browse/HADOOP-10115
>             Project: Hadoop Common
>          Issue Type: Bug
>    Affects Versions: 3.0.0, 2.2.0
>            Reporter: Vinayakumar B
>            Assignee: Vinayakumar B
>         Attachments: HADOOP-10115-004.patch, HADOOP-10115-005.patch, HADOOP-10115.patch, HADOOP-10115.patch, HADOOP-10115.patch
>
> In the hadoop package distribution, more than 90% of the jars are duplicated in multiple places.
> For example, almost all jars in share/hadoop/hdfs/lib are already present in share/hadoop/common/lib. The same is true for every other lib directory under share.
> In any case, all of these directories are added to the classpath for the daemon processes. So, to reduce the package distribution size and the classpath overhead, remove the duplicate jars from the distribution.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
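The prefix-stripping idiom proposed in the comment can be sketched as a self-contained script. The copy_children function, the scratch directory, and the file names here are hypothetical illustrations, not the actual Hadoop dist-layout script; the point is only to show that ${dir/${src}\/} recovers the bare child name from each glob match, even for names containing spaces:

```shell
#!/usr/bin/env bash
# Sketch of the proposed loop, under assumed paths (not Hadoop's real script).
set -e

# Build a throwaway source tree; the space in "hdfs lib" exercises
# the metachar-safety the comment is after.
src=$(mktemp -d)
mkdir -- "$src/common" "$src/hdfs lib"

# Hypothetical helper mirroring the structure Allen proposes:
# iterate the glob, then strip the "$src/" prefix to get the child name.
copy_children() {
  local src=$1
  local dir
  local child
  if [ -d "$src" ]; then
    for dir in "$src"/* ; do
      child=${dir/${src}\/}     # drop the leading "$src/" from the full path
      printf '%s\n' "$child"
    done
  fi
}

copy_children "$src"
```

Note that `local dir` (not `local $dir`) is what declares the variable; `local $dir` would expand an unset variable and declare nothing. A tighter alternative for the strip itself is `child=${dir#"$src"/}`, which anchors the removal to the front of the string and quotes the prefix against glob characters.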