Re: NotYetReplicated exceptions when pushing large files into HDFS

2008-09-23 Thread Lohit
As of now, no. It's fixed at 3 retries. The idea is that if your dfs -put fails
even after three retries, then something is wrong and needs to be looked into.

Lohit
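
For anyone who needs more than the built-in three attempts, one workaround is to
retry at the application level rather than relying on the dfs -put internals. A
rough sketch against the FileSystem API (the class name, paths, retry limit, and
back-off delay below are placeholders, not something prescribed in this thread):

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class RetryingPut {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            Path src = new Path(args[0]);  // local source file
            Path dst = new Path(args[1]);  // HDFS destination
            int maxAttempts = 10;          // our own limit, layered over the client's fixed 3
            for (int attempt = 1; ; attempt++) {
                try {
                    fs.copyFromLocalFile(src, dst);
                    break;                 // upload succeeded
                } catch (IOException e) {
                    if (attempt >= maxAttempts) throw e;
                    fs.delete(dst, true);  // drop any partial file before retrying
                    Thread.sleep(5000);    // back off so the cluster can catch up
                }
            }
            fs.close();
        }
    }

Each failed attempt restarts the whole copy, so this is crude for multi-gigabyte
files, but it keeps the put from dying on a transient replication hiccup.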



Re: NotYetReplicated exceptions when pushing large files into HDFS

2008-09-23 Thread Ryan LeCompte
Thanks. Is there a way to increase the retry count?

Ryan



Re: NotYetReplicated exceptions when pushing large files into HDFS

2008-09-22 Thread lohit
Yes, these are warnings unless they fail 3 times, in which case your dfs
-put command would fail with a stack trace.
Thanks,
Lohit






Re: NotYetReplicated exceptions when pushing large files into HDFS

2008-09-22 Thread Ryan LeCompte
I've noticed that although I get a few of these exceptions, the file
is ultimately uploaded to the HDFS cluster. Does this mean that my
file ended up getting there in one piece? The exceptions are just logged
at the WARN level and indicate retry attempts.

Thanks,
Ryan
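
If you want to confirm the upload really arrived intact, a cheap sanity check is
to compare the length the NameNode reports against the local file; HDFS also
checksums blocks as they move through the write pipeline, so a put that finishes
without throwing should already be whole. A rough sketch (paths are placeholders):

    import java.io.File;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class VerifyLength {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            long localLen  = new File(args[0]).length();                    // local copy
            long remoteLen = fs.getFileStatus(new Path(args[1])).getLen();  // HDFS copy
            System.out.println(localLen == remoteLen
                ? "lengths match (" + localLen + " bytes)"
                : "MISMATCH: local=" + localLen + " hdfs=" + remoteLen);
            fs.close();
        }
    }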




NotYetReplicated exceptions when pushing large files into HDFS

2008-09-22 Thread Ryan LeCompte
Hello all,

I'd love to be able to upload very large files (e.g., 8 or 10 GB) into
HDFS, but it seems like my only option is to chop the file into smaller
pieces. Otherwise, after a while I get NotYetReplicated exceptions while
the transfer is in progress. I'm using 0.18.1. Is there any way I can do
this? Perhaps use something else besides bin/hadoop dfs -put input output?

Thanks,
Ryan
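
One alternative to bin/hadoop dfs -put worth sketching: drive the upload from
Java with the FileSystem API, which makes it possible to wrap your own error
handling or chunking around the transfer. The paths and buffer size below are
arbitrary placeholder choices:

    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.io.OutputStream;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;

    public class StreamingPut {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            InputStream in = new FileInputStream(args[0]);    // large local file
            OutputStream out = fs.create(new Path(args[1]));  // destination in HDFS
            // Stream in 64 KB buffers; the final 'true' closes both streams.
            IOUtils.copyBytes(in, out, 65536, true);
            fs.close();
        }
    }

Note that this goes through the same DFSClient write pipeline as -put, so it is
still subject to the same NotYetReplicated retries; the gain is only that you
control what happens when they run out.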