Hi,

The Java API offers a DistributedCache class which lets you do this.
The usage is detailed at
http://hadoop.apache.org/common/docs/current/api/org/apache/hadoop/filecache/DistributedCache.html
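
For building the archive itself there is no single helper call, but you can write a zip of the MapFile directory straight to HDFS with java.util.zip and then register it with DistributedCache.addCacheArchive. Roughly something like the below (untested sketch; the paths and the "#mapfile" link name are just placeholders for your own):

import java.io.IOException;
import java.net.URI;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class MapFileCacheExample {

  // Zip every file under mapFileDir (the MapFile's data and index files)
  // into zipPath, writing the zip directly to HDFS.
  static void zipMapFileDir(FileSystem fs, Path mapFileDir, Path zipPath)
      throws IOException {
    FSDataOutputStream out = fs.create(zipPath, true);
    ZipOutputStream zip = new ZipOutputStream(out);
    try {
      for (FileStatus status : fs.listStatus(mapFileDir)) {
        if (status.isDir()) {
          continue;
        }
        zip.putNextEntry(new ZipEntry(status.getPath().getName()));
        FSDataInputStream in = fs.open(status.getPath());
        try {
          IOUtils.copyBytes(in, zip, 4096, false);
        } finally {
          in.close();
        }
        zip.closeEntry();
      }
    } finally {
      zip.close(); // also closes the underlying HDFS stream
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);

    // Placeholder paths -- substitute the output dir of your earlier job.
    Path mapFileDir = new Path("/user/christian/output/mapfile");
    Path archive = new Path("/user/christian/cache/mapfile.zip");

    zipMapFileDir(fs, mapFileDir, archive);

    // Register the archive; the "#mapfile" fragment is the symlink name
    // the unpacked directory gets in each task's working directory.
    DistributedCache.addCacheArchive(
        new URI(archive.toString() + "#mapfile"), conf);
    DistributedCache.createSymlink(conf);

    // ... then hand 'conf' to the follow-up job.
  }
}

Since the zip is unpacked on every task node, the mapper of the next job should be able to open a MapFile.Reader against the "mapfile" symlink in its working directory (using the local FileSystem).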

On Fri, May 4, 2012 at 5:11 PM, i...@christianherta.de
<i...@christianherta.de> wrote:
> Hello,
> I have written a chain of map-reduce jobs which creates a MapFile. I want
> to use the MapFile in a subsequent map-reduce job via the distributed cache.
> Therefore I have to create an archive file of the folder which holds the
> /data and /index files.
>
> In the documentation and in the book "Hadoop: The Definitive Guide" there are
> only examples of how this is done on the command line. Is this possible in
> HDFS via the Hadoop Java API, too?
>
> P.S.: Distributing the files separately is not a solution; they would end up
> in different temporary folders.
>
> Thanks in advance
> Christian



-- 
Harsh J
