Hello,
I have written a chain of map-reduce jobs that creates a MapFile. I want
to use the MapFile in a subsequent map-reduce job via the distributed cache.
Therefore I have to create an archive file of the folder that holds the
/data and /index files.

In the documentation and in the book "Hadoop: The Definitive Guide" there are
only examples of how this is done on the command line. Is this also possible
in HDFS via the Hadoop Java API?

P.S.: Distributing the files separately is not a solution; they would end up
in different temporary folders.
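For context, this is roughly the kind of helper I had in mind, written against
the FileSystem API, plain java.util.zip and the old
org.apache.hadoop.filecache.DistributedCache class; the class and method names
are just placeholders of mine:

    import java.io.IOException;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipOutputStream;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.filecache.DistributedCache;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;

    public class MapFileArchiver {

        // Zip the MapFile directory (which holds the data and index files)
        // into a single archive inside HDFS, so it can later be registered
        // with the distributed cache of the follow-up job.
        public static void zipMapFileDir(Configuration conf, Path mapFileDir, Path zipFile)
                throws IOException {
            FileSystem fs = FileSystem.get(conf);
            FSDataOutputStream rawOut = fs.create(zipFile, true);
            ZipOutputStream zipOut = new ZipOutputStream(rawOut);
            try {
                for (FileStatus status : fs.listStatus(mapFileDir)) {
                    if (status.isDir()) {
                        continue; // a MapFile directory only contains plain files
                    }
                    zipOut.putNextEntry(new ZipEntry(status.getPath().getName()));
                    FSDataInputStream in = fs.open(status.getPath());
                    try {
                        IOUtils.copyBytes(in, zipOut, conf, false);
                    } finally {
                        in.close();
                    }
                    zipOut.closeEntry();
                }
            } finally {
                zipOut.close(); // also closes the underlying HDFS stream
            }
        }

        // Register the archive with the distributed cache for the next job.
        public static void addToCache(Configuration conf, Path zipFile) {
            DistributedCache.addCacheArchive(zipFile.toUri(), conf);
        }
    }

As far as I understand, the distributed cache unpacks .zip archives on the
task nodes, so the data and index files would then appear together under the
localized archive directory instead of in separate temporary folders.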

Thanks in advance
Christian
