Hi Chris,

Instead of copying the files, use the mv command. Within the same HDFS a
move is a metadata-only rename on the NameNode, so it completes almost
instantly no matter how many files the directory contains:


   - hadoop fs -mv /user/hadoop/file1 /user/hadoop/file2
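
If you would rather do this from Java (as the original question asks),
FileSystem.rename is the programmatic equivalent of fs -mv. A minimal
sketch, assuming the default Configuration picks up your cluster settings;
the class name and paths are illustrative:

   import org.apache.hadoop.conf.Configuration;
   import org.apache.hadoop.fs.FileSystem;
   import org.apache.hadoop.fs.Path;

   public class MoveDir {
     public static void main(String[] args) throws Exception {
       // Reads core-site.xml / hdfs-site.xml from the classpath.
       Configuration conf = new Configuration();
       FileSystem fs = FileSystem.get(conf);

       // rename() is a NameNode metadata operation: no data blocks move,
       // so it is fast regardless of how many files the directory holds.
       boolean moved = fs.rename(new Path("/user/hadoop/file1"),
                                 new Path("/user/hadoop/file2"));
       if (!moved) {
         System.err.println("rename failed (e.g. destination already exists)");
       }
       fs.close();
     }
   }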


Sandeep.v


On Sat, Jan 9, 2016 at 9:55 AM, Chris Nauroth <cnaur...@hortonworks.com>
wrote:

> DistCp is capable of running large copies like this in distributed
> fashion, implemented as a MapReduce job.
>
> http://hadoop.apache.org/docs/r2.7.1/hadoop-distcp/DistCp.html
>
> A lot of the literature on DistCp talks about use cases for copying across
> different clusters, but it's also completely legitimate to run DistCp
> within the same cluster.
>
> --Chris Nauroth
>
> From: Gavin Yue <yue.yuany...@gmail.com>
> Date: Friday, January 8, 2016 at 4:45 PM
> To: "user@hadoop.apache.org" <user@hadoop.apache.org>
> Subject: how to quickly fs -cp dir with thousand files?
>
> I want to cp a dir with over 8000 files to another dir in the same HDFS,
> but the copy process is really slow since it copies the files one by one.
> Is there a fast way to copy this using the Java FileSystem or FileUtil API?
>
> Thanks.
>
>
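
For reference, a same-cluster DistCp run along the lines Chris describes
could look like the following; the paths and the -m mapper count are
illustrative:

   hadoop distcp -m 20 /user/hadoop/srcdir /user/hadoop/dstdir

Each map task then copies a subset of the files in parallel, which is what
makes this faster than a sequential file-by-file fs -cp.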
