[ https://issues.apache.org/jira/browse/HADOOP-1864?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12537750 ]

Milind Bhandarkar commented on HADOOP-1864:
-------------------------------------------

Yiping,

This is clearly out of scope for Hadoop, because Java's jar handling uses its 
own zlib-based implementation that cannot handle jar files larger than 2GB. 
There is another Jira issue pending (HADOOP-2019) that will allow .tgz files 
as cache archives. Would that meet your needs?
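For illustration, a rough sketch of what that could look like. The tar packaging below is ordinary shell; the streaming invocation is hypothetical, pending HADOOP-2019, and every path and name in it is made up:

```shell
# Sketch, assuming HADOOP-2019 lands: package the binary tree as a gzipped
# tarball instead of a jar. The ZIP format jar uses stores 32-bit sizes,
# and Java's implementation breaks past 2GB; tar+gzip has no such limit.
# A toy directory stands in for the real binary tree here.
mkdir -p binaries
echo "placeholder" > binaries/mybinary
tar czf mybins.tgz -C binaries .

# Hypothetical streaming invocation once .tgz cache archives are accepted
# (names and paths are illustrative only):
#   hadoop jar hadoop-streaming.jar \
#     -input in -output out \
#     -mapper "mybins/mybinary" \
#     -cacheArchive "hdfs:///user/yiping/mybins.tgz#mybins"
```

The `#mybins` fragment is the usual cache-archive convention for naming the symlink under which the unpacked archive appears in the task's working directory.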

> Support for big jar file (>2G)
> ------------------------------
>
>                 Key: HADOOP-1864
>                 URL: https://issues.apache.org/jira/browse/HADOOP-1864
>             Project: Hadoop
>          Issue Type: Bug
>          Components: contrib/streaming
>    Affects Versions: 0.14.1
>            Reporter: Yiping Han
>            Priority: Critical
>
> We have a huge binary that needs to be distributed onto tasktracker nodes 
> in Hadoop streaming mode. We've tried both the -file option and the 
> -cacheArchive option. It seems the tasktracker node cannot unjar jar files 
> bigger than 2GB. We are considering splitting our binaries into multiple 
> jars, but with -file it seems we cannot do that. We would also prefer the 
> -cacheArchive option for performance reasons, but -cacheArchive does not 
> seem to allow more than one appearance in the streaming options. Even if 
> -cacheArchive supported multiple jars, we would still need a way to put 
> the jars into a single directory tree instead of using multiple symbolic 
> links. 
> So, in general, we need a feasible and efficient way to distribute large 
> (>2GB) binaries for Hadoop streaming. We don't know whether there is an 
> existing solution that we either missed or used incorrectly, or whether 
> extra work is needed to provide one.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.