to Firefox. It does work.
However, it doesn't work with:
bin/hadoop -ls ftp://my ftp location
Any workaround here?
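A hedged sketch of what may be going wrong here (not a verified fix): the failing call omits the `dfs` subcommand, and an FTP location containing spaces would have to be quoted. The host, user, and password below are placeholders, not real values:

```shell
# Hedged sketch, not a verified fix: the failing call is missing the
# `dfs` (or `fs`) subcommand, and a URI with spaces must be quoted.
# Host, user, and password are placeholders.
FTP_USER="user"
FTP_PASS="passwd"
FTP_HOST="ftp.example.com"
# `echo` keeps this a dry run; remove it to actually invoke Hadoop.
echo bin/hadoop dfs -ls "ftp://${FTP_USER}:${FTP_PASS}@${FTP_HOST}/some/path/"
```

Dropping the leading `echo` turns the dry run into the real invocation.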
Thank you.
Hao
On 16/07/2013 at 17:47, Hao Ren wrote:
Hi,
Actually, I tested with my own FTP host at first; however, it didn't work.
Then I changed it to 0.0.0.0.
But I
From,
Ramesh.
On Mon, Jul 15, 2013 at 3:22 PM, Hao Ren <h@claravista.fr> wrote:
Thank you, Ram
I have configured core-site.xml as follows:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <name>fs.ftp.host</name>
    <value>0.0.0.0</value>
    <description>FTP filesystem connects to this server</description>
  </property>
  <property>
    <name>fs.ftp.host.port</name>
    <value>21</value>
    <description>FTP filesystem connects to fs.ftp.host on this port</description>
  </property>
</configuration>
--
Hao Ren
ClaraVista
www.claravista.fr
or copy
that directory to HDFS on Amazon EC2.
Actually, I am new to Hadoop. I would like to know how to do multiple
copy jobs to HDFS without distcp.
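For the multiple-copy question, one possible approach (a sketch under assumptions, not a verified recipe: distcp is simply replaced by a loop of plain `hadoop fs -put` calls, and all paths below are placeholders):

```shell
# Sketch: one `hadoop fs -put` per local subdirectory, no distcp.
# SRC is a throwaway demo directory; DEST is a placeholder HDFS path.
SRC="$(mktemp -d)"
DEST="hdfs:///user/hao/export"
mkdir -p "$SRC/a" "$SRC/b"            # demo input so the loop has work
for d in "$SRC"/*/ ; do
  name=$(basename "$d")
  # `echo` keeps this a dry run; drop it to actually copy.
  echo bin/hadoop fs -put "$d" "$DEST/$name"
done
```

Each iteration is an independent Hadoop job, so failed directories can be retried individually without re-copying the rest.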
Thank you again.
by:
$ bin/hadoop dfs -ls ftp://username:passwd@hostname/some/path/
It ends with:
ls: Cannot access ftp://username:passwd@hostname/some/path/: No
such file or directory.
That seems to be the same problem.
Any workaround here?
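One way to narrow down whether Hadoop or the FTP server is at fault (a sketch only; the host, credentials, and path are the same placeholders as in the command above) is to check outside Hadoop that the path is listable, e.g. with curl, which understands ftp:// URLs:

```shell
# Sanity check independent of Hadoop: list the FTP directory with curl.
# Placeholders throughout; the trailing slash requests a listing.
# `echo` keeps this a dry run; drop it to really contact the server.
echo curl --user "username:passwd" "ftp://hostname/some/path/"
```

If curl can list the directory but Hadoop cannot, the issue is on the Hadoop side rather than the server.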
Thank you in advance.
Hao.