Hi,
   Please configure the following in core-site.xml and try.

   Use hadoop fs -ls file:///  -- to display local file system files
   Use hadoop fs -ls ftp://<your ftp location>  -- to display FTP files

If it lists the files, go for distcp.
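
For the distcp step, a command along these lines is a reasonable sketch
(the FTP host, username, password, namenode address, and both paths are
placeholders to replace with your own values):

   hadoop distcp ftp://username:password@ftp.example.com/source/dir \
       hdfs://namenode:8020/target/dir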

Reference:
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/core-default.xml


fs.ftp.host        0.0.0.0   FTP filesystem connects to this server
fs.ftp.host.port   21        FTP filesystem connects to fs.ftp.host on this port

Try setting these properties as well.
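
For example, in core-site.xml (a minimal sketch; ftp.example.com stands in
for your FTP server's address):

<property>
  <name>fs.ftp.host</name>
  <value>ftp.example.com</value>
</property>
<property>
  <name>fs.ftp.host.port</name>
  <value>21</value>
</property>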

Reference: Hadoop: The Definitive Guide, the table of Hadoop filesystems.

Filesystem   URI scheme   Java implementation             Description
                          (all under org.apache.hadoop)
FTP          ftp          fs.ftp.FTPFileSystem            A filesystem backed by
                                                          an FTP server.
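
Since FTPFileSystem plugs into the standard FileSystem API, the ordinary fs
shell commands also accept ftp:// URIs once the properties above are set,
which is one way to copy a small directory without distcp (again a sketch;
host, credentials, and paths are placeholders):

   hadoop fs -cp ftp://username:password@ftp.example.com/source/dir \
       hdfs:///target/dir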


Regards,
Ramesh.




On Fri, Jul 12, 2013 at 1:04 PM, Hao Ren <h....@claravista.fr> wrote:

> On 11/07/2013 20:47, Balaji Narayanan (பாலாஜி நாராயணன்) wrote:
>
>> multiple copy jobs to hdfs
>>
>
> Thank you for your reply and the link.
>
> I read the link before, but I didn't find any examples of copying files
> from FTP to HDFS.
>
> There are about 20-40 files in my directory. I just want to move or copy
> that directory to HDFS on Amazon EC2.
>
> Actually, I am new to Hadoop. I would like to know how to do multiple
> copy jobs to HDFS without distcp.
>
> Thank you again.
>
>
> --
> Hao Ren
> ClaraVista
> www.claravista.fr
>
