te the ftp address to Firefox. It does work.
However, it doesn't work with:
bin/hadoop -ls ftp://
Any workaround here?
Thank you.
Hao
On 16/07/2013 17:47, Hao Ren wrote:
Hi,
Actually, I tested with my own FTP host at first, but it didn't work.
Then I changed it to 0.0.0.0 and tried again.
Hi,
From,
Ramesh.
On Mon, Jul 15, 2013 at 3:22 PM, Hao Ren <mailto:h@claravista.fr>> wrote:
Thank you, Ram
I have configured core-site.xml as follows (property descriptions are from
.apache.org/docs/current/hadoop-project-dist/hadoop-common/core-default.xml):

  hadoop.tmp.dir
  fs.ftp.host = 0.0.0.0        (FTP filesystem connects to this server)
  fs.ftp.host.port = 21        (FTP filesystem connects to fs.ftp.host on this port)
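For reference, the two FTP properties quoted above would look like this in an actual core-site.xml file. This is only a sketch built from the values in this thread; the hadoop.tmp.dir value is not shown here because it does not appear in the quoted message:

```xml
<configuration>
  <property>
    <name>fs.ftp.host</name>
    <value>0.0.0.0</value>
    <description>FTP filesystem connects to this server</description>
  </property>
  <property>
    <name>fs.ftp.host.port</name>
    <value>21</value>
    <description>FTP filesystem connects to fs.ftp.host on this port</description>
  </property>
</configuration>
```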
--
Hao Ren
ClaraVista
www.claravista.fr
move or copy that directory to HDFS on Amazon EC2.
Actually, I am new to Hadoop. I would like to know how to do multiple
copy jobs to HDFS without distcp.
Thank you again.
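On the "multiple copies without distcp" question: the standard `hadoop fs -put` command already copies files (and whole directories) into HDFS. As a sketch of how one might drive several per-file put jobs, here is a small Python helper that walks a local tree and builds one `hadoop fs -put` invocation per file. The helper name and its arguments are my own for illustration; only the `hadoop fs -put src dst` command form is standard:

```python
import os

def build_put_commands(local_dir, hdfs_dir, hadoop_cmd="hadoop"):
    """Walk local_dir and build one 'hadoop fs -put' command per file,
    mirroring the local directory layout under hdfs_dir."""
    commands = []
    for root, _dirs, files in os.walk(local_dir):
        rel = os.path.relpath(root, local_dir)
        for name in sorted(files):
            src = os.path.join(root, name)
            # Files at the top level go directly under hdfs_dir;
            # nested files keep their relative subdirectory.
            dst_dir = hdfs_dir if rel == "." else f"{hdfs_dir}/{rel}"
            commands.append([hadoop_cmd, "fs", "-put", src, f"{dst_dir}/{name}"])
    return commands
```

Note that in practice `hadoop fs -put localdir /hdfs/dir` copies a whole directory recursively in a single command, so a per-file loop like this is only worthwhile for finer control, e.g. running several puts in parallel or retrying individual files.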
th by:
$ bin/hadoop dfs -ls ftp://username:passwd@hostname/some/path/
It ends with:
ls: Cannot access ftp://username:passwd@hostname/some/path/: No such file or directory.
That seems to be the same problem.
Any workaround here?
Thank you in advance.
Hao.
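One way to narrow down a "No such file or directory" on an ftp:// path is to check exactly how the URI decomposes into user, password, host, port, and path, and then verify with a plain FTP client (from the same machine the hadoop command runs on) that that user can actually see that path. A quick standard-library sketch; the function name and the example URI are made up for illustration:

```python
from urllib.parse import urlparse

def describe_ftp_uri(uri):
    """Split an ftp:// URI into the pieces an FTP filesystem client needs."""
    p = urlparse(uri)
    return {
        "user": p.username,
        "password": p.password,
        "host": p.hostname,
        "port": p.port or 21,  # FTP default when the URI omits a port
        "path": p.path,
    }
```

If the path component printed here is not what the FTP server presents for that login (some servers chroot users into their home directory, so the visible root differs from the absolute path), the -ls call will fail even though the directory exists.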