You could create a har archive of the small files and then pass the
corresponding har filesystem as the input to your MapReduce job. Would that
work?
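
For what it's worth, here is a rough sketch of that flow; the paths and
class names below are made up, and it assumes the 0.18-era mapred API.
The archive would be built on HDFS with something like

    hadoop archive -archiveName files.har /user/dmitry/smallfiles /user/dmitry

and the job would then read it through the har:// scheme:

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class HarInputJob {
      public static void main(String[] args) throws Exception {
        // Sketch only: paths are placeholders, and mapper/reducer setup is
        // omitted (the defaults, IdentityMapper/IdentityReducer, are used).
        JobConf conf = new JobConf(HarInputJob.class);
        conf.setJobName("process-har-input");
        // The input is the archive itself, addressed via the har:// scheme.
        FileInputFormat.addInputPath(conf,
            new Path("har:///user/dmitry/files.har"));
        FileOutputFormat.setOutputPath(conf, new Path("/user/dmitry/out"));
        JobClient.runJob(conf);
      }
    }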


On 9/3/08 4:24 PM, "Dmitry Pushkarev" <[EMAIL PROTECTED]> wrote:

> Not quite. I want to be able to create har archives on the local system and
> then send them to HDFS and back, since I work with many small files (10 KB)
> and Hadoop seems to behave poorly with them.
> 
> Perhaps HBase is another option. Is anyone using it in "production" mode?
> And do I really need to downgrade to 0.17.x to install it?
> 
> -----Original Message-----
> From: Devaraj Das [mailto:[EMAIL PROTECTED]
> Sent: Wednesday, September 03, 2008 3:35 AM
> To: core-user@hadoop.apache.org
> Subject: Re: har/unhar utility
> 
> Are you looking for user documentation on har? If so, here it is:
> http://hadoop.apache.org/core/docs/r0.18.0/hadoop_archives.html
> 
> 
> On 9/3/08 3:21 PM, "Dmitry Pushkarev" <[EMAIL PROTECTED]> wrote:
> 
>> Does anyone have a har/unhar utility?
>> 
>> Or at least a format description? It looks pretty obvious, but just in
>> case.
>> 
>>  
>> 
>> Thanks