Hello,

I want to measure HDFS performance on a cluster of 16 nodes, each with 12GB of memory. To reduce the impact of file system caching, each file written by TestDFSIO is 10GB, and the replication factor is 1. When I run the benchmark I get errors such as org.apache.hadoop.hdfs.server.namenode.NotReplicatedYetException and org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException. Could anyone tell me what these errors mean?
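
In case it is relevant: the replication factor is controlled by the dfs.replication property. A minimal conf/hdfs-site.xml snippet that sets it to 1 would look roughly like the following (just a sketch of the one relevant property, not my complete configuration):

<configuration>
  <!-- keep a single copy of each block (no extra replicas) -->
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>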

Thanks,
Da

zhengda@occalit27:~/hadoop-0.20.2$ bin/hadoop jar hadoop-0.20.2-test.jar TestDFSIO -write -fileSize 10000 -nrFiles 48
TestFDSIO.0.0.4
11/02/01 12:05:29 INFO mapred.FileInputFormat: nrFiles = 48
11/02/01 12:05:29 INFO mapred.FileInputFormat: fileSize (MB) = 10000
11/02/01 12:05:29 INFO mapred.FileInputFormat: bufferSize = 1000000
11/02/01 12:05:30 INFO mapred.FileInputFormat: creating control file: 10000 mega bytes, 48 files
11/02/01 12:05:40 INFO mapred.FileInputFormat: created control files for: 48 files
11/02/01 12:05:40 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
11/02/01 12:05:40 INFO mapred.FileInputFormat: Total input paths to process : 48
11/02/01 12:05:40 INFO mapred.JobClient: Running job: job_201102011156_0004
11/02/01 12:05:41 INFO mapred.JobClient:  map 0% reduce 0%
11/02/01 12:12:25 INFO mapred.JobClient:  map 2% reduce 0%
11/02/01 12:12:28 INFO mapred.JobClient:  map 4% reduce 0%
11/02/01 12:12:34 INFO mapred.JobClient:  map 8% reduce 0%
11/02/01 12:12:40 INFO mapred.JobClient:  map 10% reduce 0%
11/02/01 12:12:43 INFO mapred.JobClient:  map 16% reduce 0%
11/02/01 12:12:46 INFO mapred.JobClient:  map 24% reduce 0%
11/02/01 12:12:49 INFO mapred.JobClient:  map 29% reduce 5%
11/02/01 12:12:52 INFO mapred.JobClient:  map 31% reduce 5%
11/02/01 12:12:58 INFO mapred.JobClient:  map 31% reduce 9%
11/02/01 12:13:04 INFO mapred.JobClient:  map 33% reduce 10%
11/02/01 12:13:07 INFO mapred.JobClient:  map 37% reduce 10%
11/02/01 12:13:13 INFO mapred.JobClient:  map 37% reduce 11%
11/02/01 12:13:19 INFO mapred.JobClient:  map 41% reduce 12%
11/02/01 12:13:22 INFO mapred.JobClient:  map 47% reduce 12%
11/02/01 12:13:25 INFO mapred.JobClient:  map 52% reduce 12%
11/02/01 12:13:28 INFO mapred.JobClient:  map 54% reduce 13%
11/02/01 12:13:31 INFO mapred.JobClient:  map 56% reduce 13%
11/02/01 12:13:34 INFO mapred.JobClient:  map 56% reduce 18%
11/02/01 12:13:37 INFO mapred.JobClient:  map 58% reduce 18%
11/02/01 12:13:43 INFO mapred.JobClient:  map 60% reduce 18%
11/02/01 12:13:46 INFO mapred.JobClient:  map 62% reduce 18%
11/02/01 12:13:49 INFO mapred.JobClient:  map 62% reduce 20%
11/02/01 12:14:10 INFO mapred.JobClient:  map 64% reduce 20%
11/02/01 12:14:19 INFO mapred.JobClient:  map 64% reduce 21%
11/02/01 12:14:25 INFO mapred.JobClient:  map 66% reduce 21%
11/02/01 12:14:28 INFO mapred.JobClient:  map 68% reduce 21%
11/02/01 12:14:34 INFO mapred.JobClient:  map 70% reduce 22%
11/02/01 12:14:40 INFO mapred.JobClient:  map 81% reduce 22%
11/02/01 12:14:44 INFO mapred.JobClient:  map 81% reduce 23%
11/02/01 12:14:47 INFO mapred.JobClient:  map 85% reduce 23%
11/02/01 12:14:50 INFO mapred.JobClient:  map 85% reduce 27%
11/02/01 12:14:56 INFO mapred.JobClient:  map 87% reduce 27%
11/02/01 12:14:59 INFO mapred.JobClient:  map 87% reduce 28%
11/02/01 12:15:05 INFO mapred.JobClient:  map 87% reduce 29%
11/02/01 12:15:14 INFO mapred.JobClient:  map 89% reduce 29%
11/02/01 12:15:16 INFO mapred.JobClient: Task Id : attempt_201102011156_0004_m_000011_0, Status : FAILED
org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.server.namenode.NotReplicatedYetException: Not replicated yet:/benchmarks/TestDFSIO/io_data/test_io_21
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1257)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:422)
    at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

    at org.apache.hadoop.ipc.Client.call(Client.java:740)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy1.addBlock(Unknown Source)
    at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy1.addBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:2937)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2819)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)

attempt_201102011156_0004_m_000011_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201102011156_0004_m_000011_0: log4j:WARN Please initialize the log4j system properly.
11/02/01 12:15:16 INFO mapred.JobClient: Task Id : attempt_201102011156_0004_m_000026_0, Status : FAILED
org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.server.namenode.NotReplicatedYetException: Not replicated yet:/benchmarks/TestDFSIO/io_data/test_io_36
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1257)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:422)
    at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

    at org.apache.hadoop.ipc.Client.call(Client.java:740)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy1.addBlock(Unknown Source)
    at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy1.addBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:2937)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2819)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)

attempt_201102011156_0004_m_000026_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201102011156_0004_m_000026_0: log4j:WARN Please initialize the log4j system properly.
11/02/01 12:15:56 INFO mapred.JobClient:  map 91% reduce 29%
11/02/01 12:16:04 INFO mapred.JobClient:  map 91% reduce 30%
11/02/01 12:16:08 INFO mapred.JobClient:  map 93% reduce 30%
11/02/01 12:16:11 INFO mapred.JobClient:  map 95% reduce 30%
11/02/01 12:16:19 INFO mapred.JobClient:  map 95% reduce 31%
11/02/01 12:17:55 INFO mapred.JobClient: Task Id : attempt_201102011156_0004_m_000011_1, Status : FAILED
org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException: failed to create file /benchmarks/TestDFSIO/io_data/test_io_21 for DFSClient_attempt_201102011156_0004_m_000005_0 on client 67.58.56.96, because this file is already being created by DFSClient_attempt_201102011156_0004_m_000011_0 on 67.58.56.75
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1068)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:981)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:377)
    at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

    at org.apache.hadoop.ipc.Client.call(Client.java:740)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy1.create(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy1.create(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:2707)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:492)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:195)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:484)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:465)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:427)
    at org.apache.hadoop.fs.TestDFSIO$WriteMapper.doIO(TestDFSIO.java:193)
    at org.apache.hadoop.fs.IOMapperBase.map(IOMapperBase.java:124)
    at org.apache.hadoop.fs.IOMapperBase.map(IOMapperBase.java:42)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
    at org.apache.hadoop.mapred.Child.main(Child.java:170)

11/02/01 12:17:58 INFO mapred.JobClient: Task Id : attempt_201102011156_0004_m_000026_1, Status : FAILED
org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException: failed to create file /benchmarks/TestDFSIO/io_data/test_io_36 for DFSClient_attempt_201102011156_0004_m_000006_0 on client 67.58.56.96, because this file is already being created by DFSClient_attempt_201102011156_0004_m_000049_0 on 67.58.56.75
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1068)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:981)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:377)
    at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

    at org.apache.hadoop.ipc.Client.call(Client.java:740)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy1.create(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy1.create(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:2707)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:492)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:195)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:484)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:465)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:427)
    at org.apache.hadoop.fs.TestDFSIO$WriteMapper.doIO(TestDFSIO.java:193)
    at org.apache.hadoop.fs.IOMapperBase.map(IOMapperBase.java:124)
    at org.apache.hadoop.fs.IOMapperBase.map(IOMapperBase.java:42)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
    at org.apache.hadoop.mapred.Child.main(Child.java:170)

11/02/01 12:20:22 INFO mapred.JobClient: Task Id : attempt_201102011156_0004_m_000011_2, Status : FAILED
org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException: failed to create file /benchmarks/TestDFSIO/io_data/test_io_21 for DFSClient_attempt_201102011156_0004_m_000022_1 on client 67.58.56.74, because this file is already being created by DFSClient_attempt_201102011156_0004_m_000011_0 on 67.58.56.75
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1068)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:981)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:377)
    at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

    at org.apache.hadoop.ipc.Client.call(Client.java:740)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy1.create(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy1.create(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:2707)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:492)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:195)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:484)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:465)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:427)
    at org.apache.hadoop.fs.TestDFSIO$WriteMapper.doIO(TestDFSIO.java:193)
    at org.apache.hadoop.fs.IOMapperBase.map(IOMapperBase.java:124)
    at org.apache.hadoop.fs.IOMapperBase.map(IOMapperBase.java:42)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
    at org.apache.hadoop.mapred.Child.main(Child.java:170)

11/02/01 12:20:25 INFO mapred.JobClient: Task Id : attempt_201102011156_0004_m_000026_2, Status : FAILED
org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException: failed to create file /benchmarks/TestDFSIO/io_data/test_io_36 for DFSClient_attempt_201102011156_0004_m_000043_0 on client 67.58.56.89, because this file is already being created by DFSClient_attempt_201102011156_0004_m_000049_0 on 67.58.56.75
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1068)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:981)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:377)
    at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

    at org.apache.hadoop.ipc.Client.call(Client.java:740)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy1.create(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at $Proxy1.create(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:2707)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:492)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:195)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:484)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:465)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:427)
    at org.apache.hadoop.fs.TestDFSIO$WriteMapper.doIO(TestDFSIO.java:193)
    at org.apache.hadoop.fs.IOMapperBase.map(IOMapperBase.java:124)
    at org.apache.hadoop.fs.IOMapperBase.map(IOMapperBase.java:42)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
    at org.apache.hadoop.mapred.Child.main(Child.java:170)

11/02/01 12:23:04 INFO mapred.JobClient: Job complete: job_201102011156_0004
11/02/01 12:23:04 INFO mapred.JobClient: Counters: 15
11/02/01 12:23:04 INFO mapred.JobClient:   Job Counters
11/02/01 12:23:04 INFO mapred.JobClient:     Launched reduce tasks=2
11/02/01 12:23:04 INFO mapred.JobClient:     Rack-local map tasks=48
11/02/01 12:23:04 INFO mapred.JobClient:     Launched map tasks=81
11/02/01 12:23:04 INFO mapred.JobClient:     Data-local map tasks=33
11/02/01 12:23:04 INFO mapred.JobClient:     Failed map tasks=1
11/02/01 12:23:04 INFO mapred.JobClient:   FileSystemCounters
11/02/01 12:23:04 INFO mapred.JobClient:     HDFS_BYTES_READ=5234
11/02/01 12:23:04 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=6271
11/02/01 12:23:04 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=482344960000
11/02/01 12:23:04 INFO mapred.JobClient:   Map-Reduce Framework
11/02/01 12:23:04 INFO mapred.JobClient:     Combine output records=0
11/02/01 12:23:04 INFO mapred.JobClient:     Map input records=46
11/02/01 12:23:04 INFO mapred.JobClient:     Spilled Records=230
11/02/01 12:23:04 INFO mapred.JobClient:     Map output bytes=4063
11/02/01 12:23:04 INFO mapred.JobClient:     Map input bytes=1278
11/02/01 12:23:04 INFO mapred.JobClient:     Combine input records=0
11/02/01 12:23:04 INFO mapred.JobClient:     Map output records=230
java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1252)
    at org.apache.hadoop.fs.TestDFSIO.runIOTest(TestDFSIO.java:236)
    at org.apache.hadoop.fs.TestDFSIO.writeTest(TestDFSIO.java:218)
    at org.apache.hadoop.fs.TestDFSIO.main(TestDFSIO.java:354)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
    at org.apache.hadoop.test.AllTestDriver.main(AllTestDriver.java:81)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
