From: ha...@cloudera.com
Date: Fri, 5 Jul 2013 16:13:16 +0530
Subject: Re: How to update a file which is in HDFS
To: user@hadoop.apache.org
The answer to the "delta" part is more that HDFS does not presently
support random writes. You cannot alter a closed file for anything
other than appending at the end, which I doubt will help you if you
are also receiving updates (it isn't clear from your question what
this added data really is).
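Harsh's append-only constraint is why the usual workaround is not to modify the existing file at all, but to drop the new records into the same dataset directory as an additional part file and read the directory as one logical dataset. Below is a minimal local sketch of that pattern (not from the thread; the directory layout and "part-NNNNN" naming are illustrative conventions, and a plain local directory stands in for an HDFS path):

```python
import os
import tempfile

def write_part(dataset_dir: str, part_name: str, records: list[str]) -> None:
    """Write records as their own part file; nothing already present in
    the directory is rewritten, mirroring HDFS's write-once files."""
    with open(os.path.join(dataset_dir, part_name), "w") as f:
        f.writelines(r + "\n" for r in records)

def read_dataset(dataset_dir: str) -> list[str]:
    """Read every part file in name order, yielding the logical dataset."""
    records = []
    for name in sorted(os.listdir(dataset_dir)):
        with open(os.path.join(dataset_dir, name)) as f:
            records.extend(line.rstrip("\n") for line in f)
    return records

# Demo: the original load plus a 2-record delta, read back as one dataset.
dataset = tempfile.mkdtemp()
write_part(dataset, "part-00000", [f"row-{i}" for i in range(5)])  # original load
write_part(dataset, "part-00001", ["row-5", "row-6"])              # the delta
print(len(read_dataset(dataset)))  # 7
```

MapReduce, Hive, and Pig all accept a directory as input, so downstream jobs see the union of parts without any file ever being updated in place.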
Are the new records at the end of the file? Or can they be anywhere in the file? If the latter, you will need more complex logic.

John

From: Mohammad Tariq [mailto:donta...@gmail.com]
Sent: Thursday, July 04, 2013 5:47 AM
To: user@hadoop.apache.org
Subject: Re: How to update a file which is in HDFS
Hello Manickam,

Append is currently not possible.

Warm Regards,
Tariq
cloudfront.blogspot.com

On Thu, Jul 4, 2013 at 4:40 PM, Manickam P <manicka...@outlook.com> wrote:
You can append using WebHDFS. The following link may help you:
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/WebHDFS.html#Append_to_a_File

On Thu, Jul 4, 2013 at 5:17 PM, Mohammad Tariq wrote:
> Hello Manickam,
>
> Append is currently not possible.
>
> Warm Regards
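As the linked WebHDFS documentation describes, an append is a two-step exchange: the client first asks the NameNode for op=APPEND, gets a 307 redirect naming a DataNode, and then POSTs the bytes to that second URL. A sketch of the step-1 URL construction (the host, port, path, and user below are made-up examples; 50070 is the classic NameNode web port, and depending on the Hadoop version append may also need to be enabled server-side):

```python
from urllib.parse import urlencode

def webhdfs_append_url(namenode_host: str, port: int, hdfs_path: str, user: str) -> str:
    """Build the step-1 APPEND URL sent to the NameNode. The NameNode
    replies with a 307 redirect whose Location header names the DataNode
    to which the actual file bytes are then POSTed."""
    query = urlencode({"op": "APPEND", "user.name": user})
    return f"http://{namenode_host}:{port}/webhdfs/v1{hdfs_path}?{query}"

url = webhdfs_append_url("namenode.example.com", 50070, "/data/input.txt", "manickam")
print(url)
# http://namenode.example.com:50070/webhdfs/v1/data/input.txt?op=APPEND&user.name=manickam
#
# The equivalent curl session is two requests:
#   curl -i -X POST "<url>"                        -> 307, Location: <datanode-url>
#   curl -i -X POST -T delta.txt "<datanode-url>"  -> 200 OK, bytes appended
```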
Hi,

I have moved my input file into the HDFS location in the cluster setup. Now I have got a new set of file which has the same one million records plus 1000 new records added at the end of the file. Here I just want to move the 1000 new records alone into HDFS instead of overwriting the entire file, because it will take more time to move the whole file from my local machine to the HDFS location. Is it possible?

Can I use HBase for this scenario? I don't have a clear idea about HBase. Just asking.

Thanks,
Manickam P
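For the scenario described above, the cheap local step is to split off just the 1000 appended records before shipping anything to the cluster. A sketch of that split (assuming, per the question, records are line-oriented and strictly appended at the end; all names here are illustrative):

```python
def extract_delta(new_records: list[str], old_count: int) -> list[str]:
    """Return only the records after the first old_count lines. Valid only
    if new data is strictly appended and the original records are unchanged,
    which is exactly John's "at the end of the file?" question."""
    return new_records[old_count:]

# Stand-ins for the one-million-record original and the 1000-record growth.
old = [f"row-{i}" for i in range(1_000_000)]
new_file = old + [f"new-{i}" for i in range(1000)]

delta = extract_delta(new_file, len(old))
print(len(delta))  # 1000
print(delta[0])    # new-0
```

The resulting delta file can then be copied to HDFS on its own (for example as a new part file in the same input directory), so the million unchanged records never cross the network again.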