I didn't use the s3n filesystem. But from the output "cp:
java.io.IOException: mkdirs: Pathname too long. Limit 8000 characters,
1000 levels.", I think the problem is that the path is either longer
than 8000 characters or more than 1000 levels deep.
You only have 998 folders. Maybe the last one is more than 8000
characters. Why not count the last one's length?

BRs//Julian
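
To check that (a sketch, not from the thread; it assumes the Hadoop 1.x
shell, where "hadoop fs -lsr" is the recursive listing), the longest and
deepest path under /test/srcData could be measured with:

  # Print the length of the longest path and the depth of the deepest one.
  hadoop fs -lsr /test/srcData | awk '{ print $NF }' |
    awk -F/ 'length > maxlen   { maxlen = length }
             NF - 1 > maxdepth { maxdepth = NF - 1 }
             END { print "longest:", maxlen, "chars; deepest:", maxdepth, "levels" }'

If the 998 folders are nested copies of srcData, each level adds 8
characters ("/srcData"), so 998 levels already comes to roughly 7984
characters, just under the 8000-character limit.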
-- Original --
From: "Subroto" <ssan...@datameer.com>
Date: Tue, Mar 5, 2013 10:22 PM
To: "user" <user@hadoop.apache.org>
Subject: S3N copy creating recursive folders

Hi,

I am using Hadoop 1.0.3 and trying to execute:
hadoop fs -cp "s3n://acessKey:acesssec...@bucket.name/srcData" /test/srcData

This ends up with:
cp: java.io.IOException: mkdirs: Pathname too long. Limit 8000 characters,
1000 levels.

When I try to list the folder recursively /test/srcData: it
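
For reference (an aside, not part of the original message): in Hadoop 1.x
the s3n credentials can also be supplied as configuration properties
rather than embedded in the URI, which keeps the key out of the path:

  hadoop fs -D fs.s3n.awsAccessKeyId=acessKey \
            -D fs.s3n.awsSecretAccessKey=... \
            -cp s3n://bucket.name/srcData /test/srcData

fs.s3n.awsAccessKeyId and fs.s3n.awsSecretAccessKey are the standard s3n
property names; the secret is left elided as in the original command.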