I'm not sure that's the issue; I basically tarred up the Hadoop directory from the cluster and copied it over to the non-datanode machine. But I do agree I've likely got a setting wrong, since I can run distcp from the namenode and it works fine. The question is which one.
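For reference, the two properties Aaron points at below live in conf/hadoop-site.xml on the client machine and must match the cluster. A minimal sketch of what that file might look like; the hostname and port here are placeholders, not values from this thread, and the mapred.system.dir value is simply the default path that shows up in the error:

```xml
<!-- conf/hadoop-site.xml on the client machine.
     These values must match the cluster's own configuration.
     namenode.example.com:9000 is a placeholder, not a real host. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode.example.com:9000</value>
  </property>
  <property>
    <name>mapred.system.dir</name>
    <!-- default shown in the error; must be the same path the JobTracker uses -->
    <value>/tmp/hadoop-hadoop/mapred/system</value>
  </property>
</configuration>
```

An easy way to check for a mismatch is to diff this file against the copy on the namenode.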
On Mon, Sep 8, 2008 at 7:04 PM, Aaron Kimball <[EMAIL PROTECTED]> wrote:

> It is likely that your mapred.system.dir and/or fs.default.name settings
> are incorrect on the non-datanode machine that you are launching the task
> from. These two settings (in your conf/hadoop-site.xml file) must match
> the settings on the cluster itself.
>
> - Aaron
>
> On Sun, Sep 7, 2008 at 8:58 PM, Michael Di Domenico
> <[EMAIL PROTECTED]> wrote:
>
> > I'm attempting to load data into Hadoop (version 0.17.1) from a
> > non-datanode machine in the cluster. I can run jobs, and copyFromLocal
> > works fine, but when I try to use distcp I get the error below. I don't
> > understand the error; can anyone help?
> > Thanks
> >
> > blue:hadoop-0.17.1 mdidomenico$ time bin/hadoop distcp -overwrite
> > file:///Users/mdidomenico/hadoop/1gTestfile /user/mdidomenico/1gTestfile
> > 08/09/07 23:56:06 INFO util.CopyFiles:
> > srcPaths=[file:/Users/mdidomenico/hadoop/1gTestfile]
> > 08/09/07 23:56:06 INFO util.CopyFiles:
> > destPath=/user/mdidomenico/1gTestfile1
> > 08/09/07 23:56:07 INFO util.CopyFiles: srcCount=1
> > With failures, global counters are inaccurate; consider running with -i
> > Copy failed: org.apache.hadoop.ipc.RemoteException: java.io.IOException:
> > /tmp/hadoop-hadoop/mapred/system/job_200809072254_0005/job.xml: No such
> > file or directory
> >     at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:215)
> >     at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:149)
> >     at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1155)
> >     at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1136)
> >     at org.apache.hadoop.mapred.JobInProgress.<init>(JobInProgress.java:175)
> >     at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:1755)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >     at java.lang.reflect.Method.invoke(Unknown Source)
> >     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:446)
> >     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:896)
> >
> >     at org.apache.hadoop.ipc.Client.call(Client.java:557)
> >     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:212)
> >     at $Proxy1.submitJob(Unknown Source)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >     at java.lang.reflect.Method.invoke(Method.java:585)
> >     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
> >     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
> >     at $Proxy1.submitJob(Unknown Source)
> >     at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:758)
> >     at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:973)
> >     at org.apache.hadoop.util.CopyFiles.copy(CopyFiles.java:604)
> >     at org.apache.hadoop.util.CopyFiles.run(CopyFiles.java:743)
> >     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> >     at org.apache.hadoop.util.CopyFiles.main(CopyFiles.java:763)