http://mapredit.blogspot.com
>
> On Feb 7, 2012, at 11:03 AM, Xiaobin She wrote:
>
Hi all,
Sorry if it is not appropriate to send one thread to two mailing lists.
I'm trying to use Hadoop and Hive to do some log analysis jobs.
Our system generates lots of logs every day; for example, it produced about
370GB of logs (including lots of log files) yesterday, and every day the logs
increase
> - get the required compression codec
> - write to CompressionOutputStream.
>
> You should get a well detailed explanation on the same from the book
> 'Hadoop - The definitive guide' by Tom White.
> Regards
> Bejoy K S
>
> From handheld, Please excuse typos.
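The steps above (get a compression codec, then write through its compressing stream) can be sketched in plain Java. On a real cluster you would obtain the codec from Hadoop's CompressionCodecFactory and write to the CompressionOutputStream it returns, as the book describes; the runnable stand-in below uses java.util.zip's GZIPOutputStream, which follows the same write-through-a-compressing-stream pattern. The class and helper names are my own.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipLogWriter {
    // Compress a buffer by writing it through a compressing stream.
    // In Hadoop you would get this stream from a CompressionCodec
    // (codec.createOutputStream(fs.create(path))) instead of GZIPOutputStream.
    static byte[] gzip(byte[] raw) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream out = new GZIPOutputStream(buf)) {
            out.write(raw);
        }
        return buf.toByteArray();
    }

    // Inverse operation, useful for checking what was written.
    static byte[] gunzip(byte[] gz) throws IOException {
        try (GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(gz))) {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            byte[] chunk = new byte[4096];
            int n;
            while ((n = in.read(chunk)) != -1) buf.write(chunk, 0, n);
            return buf.toByteArray();
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] gz = gzip("one log line\n".getBytes("UTF-8"));
        System.out.println("compressed to " + gz.length + " bytes");
    }
}
```

With Hadoop on the classpath, the only change is where the stream comes from; the write loop stays the same.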
> Look at the Flume project from Cloudera. I use it
> for writing data into HDFS.
>
> https://ccp.cloudera.com/display/SUPPORT/Downloads
>
> dave
>
> 2012/2/6 Xiaobin She
>
> > hi Bejoy ,
> >
> > thank you for your reply.
> >
> > actually I have
> ...acheLog Input Format already)
>
>
> Regards
> Bejoy K S
>
>
> -Original Message-
> From: Xiaobin She
> Date: Mon, 6 Feb 2012 16:41:50
> To: ; 佘晓彬
> Reply-To: common-user@hadoop.apache.org
> Subject: Re: Can
Sorry, this sentence is wrong:
"I can't compress these logs every hour and then put them into HDFS."
It should be:
"I can compress these logs every hour and then put them into HDFS."
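The corrected workflow (compress each hour's finished log locally, then put the .gz into HDFS) might look roughly like the sketch below. File paths and class names are hypothetical, and the `hadoop fs -put` step is only shown in a comment, since no cluster is assumed here.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.zip.GZIPOutputStream;

public class HourlyLogShipper {
    // Compress src into src + ".gz" and return the new path.
    static Path compress(Path src) throws IOException {
        Path dst = Paths.get(src.toString() + ".gz");
        try (InputStream in = Files.newInputStream(src);
             OutputStream out = new GZIPOutputStream(Files.newOutputStream(dst))) {
            in.transferTo(out); // stream the whole file through gzip (Java 9+)
        }
        return dst;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical hourly log file, stand-in content.
        Path log = Paths.get("/tmp/access.2012020716.log");
        Files.write(log, "line1\nline2\n".getBytes("UTF-8"));
        Path gz = compress(log);
        // Next step on a real cluster (not run here):
        //   hadoop fs -put /tmp/access.2012020716.log.gz /logs/2012020716/
        System.out.println("compressed: " + gz);
    }
}
```

Each hour's file is closed before compression, which sidesteps the append-to-a-compressed-file question entirely: every .gz in HDFS is immutable once written.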
2012/2/6 Xiaobin She
Hi all,
I'm testing Hadoop and Hive, and I want to use them in log analysis.
Here I have a question: can I write/append logs to a compressed file which
is located in HDFS?
Our system generates lots of log files every day; I can't compress these
logs every hour and then put them into HDFS.
But w