[ https://issues.apache.org/jira/browse/HADOOP-2129?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12541447 ]

chris.douglas edited comment on HADOOP-2129 at 11/9/07 1:38 PM:
----------------------------------------------------------------

Copying from A to B by running distcp on B with -i (ignore read failures) generated the following exception trace (prolifically):

{noformat}
FAIL hdfs://namenode-of-B:8600/targetdir/targetfile : org.apache.hadoop.ipc.RemoteException: java.io.IOException: Cannot open filename /targetdir/targetfile
        at org.apache.hadoop.dfs.NameNode.open(NameNode.java:238)
        at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:379)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:596)

        at org.apache.hadoop.ipc.Client.call(Client.java:482)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:184)
        at org.apache.hadoop.dfs.$Proxy1.open(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at org.apache.hadoop.dfs.$Proxy1.open(Unknown Source)
        at org.apache.hadoop.dfs.DFSClient$DFSInputStream.openInfo(DFSClient.java:848)
        at org.apache.hadoop.dfs.DFSClient$DFSInputStream.<init>(DFSClient.java:840)
        at org.apache.hadoop.dfs.DFSClient.open(DFSClient.java:285)
        at org.apache.hadoop.dfs.DistributedFileSystem.open(DistributedFileSystem.java:114)
        at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:244)
        at org.apache.hadoop.util.CopyFiles$FSCopyFilesMapper.copy(CopyFiles.java:289)
        at org.apache.hadoop.util.CopyFiles$FSCopyFilesMapper.map(CopyFiles.java:367)
        at org.apache.hadoop.util.CopyFiles$FSCopyFilesMapper.map(CopyFiles.java:218)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:192)
        at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1760)
{noformat}

All the directories were created successfully, so the src file list is readable at the destination, but none of the files could be opened at the src.
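
For reference, a minimal, hypothetical sketch of the FileSystem resolution difference that matters when a copy job runs on the destination cluster rather than the source. This is not the CopyFiles code; the class name and paths are made up for illustration. An unqualified path resolved through the job's default FileSystem is looked up on whichever cluster the job runs on, while resolving through the path's own URI works from either side:

{noformat}
// Hypothetical sketch only; not the CopyFiles implementation.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CrossClusterOpenSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();

    // Resolved against fs.default.name, i.e. whichever cluster the job runs on.
    // Running on B, this would look for /srcdir/somefile on B's namenode and fail.
    FileSystem defaultFs = FileSystem.get(conf);
    // defaultFs.open(new Path("/srcdir/somefile"));

    // Resolved against the namenode named in the URI; works from either cluster.
    Path src = new Path("hdfs://namenode-of-A:8600/srcdir/somefile");
    FileSystem srcFs = src.getFileSystem(conf);
    srcFs.open(src).close();
  }
}
{noformat}

Whether that is actually what is happening here depends on the CopyFiles revision in question, which is why the trace mismatch below matters.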

The trace through o.a.h.u.CopyFiles doesn't match what's in the repository, 
though. Is this running with any custom patches?

> distcp between two clusters does not work if it is run on the target cluster
> ----------------------------------------------------------------------------
>
>                 Key: HADOOP-2129
>                 URL: https://issues.apache.org/jira/browse/HADOOP-2129
>             Project: Hadoop
>          Issue Type: Bug
>          Components: util
>    Affects Versions: 0.16.0
>         Environment: Nightly build: 
> http://hadoopqa.yst.corp.yahoo.com:8080/hudson/job/Hadoop-LinuxTest/718/
> With patches for HADOOP-2033 and HADOOP-2048.
>            Reporter: Murtaza A. Basrai
>            Assignee: Chris Douglas
>            Priority: Critical
>
> I am trying to copy a directory (~100k files, ~500GB) between two clusters A 
> and B (~70 nodes), using a command like:
> hadoop distcp -log /logdir hdfs://namenode-of-A:8600/srcdir hdfs://namenode-of-B:8600/targetdir
> I tried 4 ways of doing it:
> 1) Copy from A to B, by running distcp on A
> 2) Copy from A to B, by running distcp on B
> 3) Copy from B to A, by running distcp on B
> 4) Copy from B to A, by running distcp on A
> Invocations 1 and 3 succeeded, but 2 and 4 failed.
> I got a lot of errors of the type below:
> 07/10/30 20:52:11 INFO mapred.JobClient: Running job: job_200710180049_0115
> 07/10/30 20:52:12 INFO mapred.JobClient:  map 0% reduce 0%
> 07/10/30 20:54:41 INFO mapred.JobClient:  map 1% reduce 0%
> 07/10/30 20:56:52 INFO mapred.JobClient:  map 2% reduce 0%
> 07/10/30 20:57:41 INFO mapred.JobClient: Task Id : task_200710180049_0115_m_000184_0, Status : FAILED
> java.io.IOException: Some copies could not complete. See log for details.
>         at org.apache.hadoop.util.CopyFiles$FSCopyFilesMapper.close(CopyFiles.java:407)
>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:53)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:192)
>         at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1760)
> followed by the job failing:
> 07/10/30 22:07:41 INFO mapred.JobClient:  map 99% reduce 100%
> Copy failed: java.io.IOException: Job failed!
>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:688)
>         at org.apache.hadoop.util.CopyFiles.copy(CopyFiles.java:481)
>         at org.apache.hadoop.util.CopyFiles.run(CopyFiles.java:555)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:54)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:67)
>         at org.apache.hadoop.util.CopyFiles.main(CopyFiles.java:566)

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
