Hi.

I have a strange problem with Hadoop when I run jobs under Windows (my
laptop runs XP, but all the cluster machines, including the namenode, run
Ubuntu). I run a job (which runs perfectly under Linux, and all configs
and Java versions are the same); all mappers finish successfully, and so
does the reducer, but when it tries to copy the resulting file to the
output directory I get things like:

03.10.2008 21:47:24 *INFO * audit: ugi=Dmitry,mkpasswd,root,None,Administrators,Users ip=/171.65.102.211 cmd=rename
src=/user/public/tmp/streaming-job12345/out48/_temporary/_attempt_200810032005_0013_r_000000_0/part-00000
dst=/user/public/tmp/streaming-job12345/out48/_temporary/_attempt_200810032005_0013_r_000000_0/part-00000
perm=Dmitry:supergroup:rw-r--r-- (FSNamesystem.java, line 94)

And then it deletes the file, and I get no output.

Why does it rename the file onto itself (src and dst in the log above are
identical), and does it have anything to do with Path.getParent()?

Thanks.
