Hi everyone,
I am getting this error when I run TestDFSIO. The job actually finishes
successfully (according to the JobTracker, at least), but this is what I
get on the console:
crawler@d1r2n2:/hadoop$ bin/hadoop jar hadoop-test-1.1.1.jar TestDFSIO -write -nrFiles 10 -fileSize 1000
TestDFSIO.0.0.4
13/04/19 17:23:43 INFO fs.TestDFSIO: nrFiles = 10
13/04/19 17:23:43 INFO fs.TestDFSIO: fileSize (MB) = 1000
13/04/19 17:23:43 INFO fs.TestDFSIO: bufferSize = 1000000
13/04/19 17:23:43 INFO fs.TestDFSIO: creating control file: 1000 mega bytes, 10 files
13/04/19 17:23:44 INFO fs.TestDFSIO: created control files for: 10 files
13/04/19 17:23:44 INFO mapred.FileInputFormat: Total input paths to process : 10
13/04/19 17:23:44 INFO mapred.JobClient: Running job: job_201304191712_0002
13/04/19 17:23:45 INFO mapred.JobClient: map 0% reduce 0%
13/04/19 17:24:06 INFO mapred.JobClient: map 20% reduce 0%
13/04/19 17:24:07 INFO mapred.JobClient: map 30% reduce 0%
13/04/19 17:24:09 INFO mapred.JobClient: map 50% reduce 0%
13/04/19 17:24:11 INFO mapred.JobClient: map 60% reduce 0%
13/04/19 17:24:12 INFO mapred.JobClient: map 90% reduce 0%
13/04/19 17:24:13 INFO mapred.JobClient: map 100% reduce 0%
13/04/19 17:24:21 INFO mapred.JobClient: map 100% reduce 33%
13/04/19 17:24:22 INFO mapred.JobClient: map 100% reduce 100%
13/04/19 17:24:23 INFO mapred.JobClient: Job complete: job_201304191712_0002
13/04/19 17:24:23 INFO mapred.JobClient: Counters: 33
13/04/19 17:24:23 INFO mapred.JobClient: Job Counters
13/04/19 17:24:23 INFO mapred.JobClient: Launched reduce tasks=1
13/04/19 17:24:23 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=210932
13/04/19 17:24:23 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
13/04/19 17:24:23 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
13/04/19 17:24:23 INFO mapred.JobClient: Rack-local map tasks=2
13/04/19 17:24:23 INFO mapred.JobClient: Launched map tasks=10
13/04/19 17:24:23 INFO mapred.JobClient: Data-local map tasks=8
13/04/19 17:24:23 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=8650
13/04/19 17:24:23 INFO mapred.JobClient: File Input Format Counters
13/04/19 17:24:23 INFO mapred.JobClient: Bytes Read=1120
13/04/19 17:24:23 INFO mapred.JobClient: SkippingTaskCounters
13/04/19 17:24:23 INFO mapred.JobClient: MapProcessedRecords=10
13/04/19 17:24:23 INFO mapred.JobClient: ReduceProcessedGroups=5
13/04/19 17:24:23 INFO mapred.JobClient: File Output Format Counters
13/04/19 17:24:23 INFO mapred.JobClient: Bytes Written=79
13/04/19 17:24:23 INFO mapred.JobClient: FileSystemCounters
13/04/19 17:24:23 INFO mapred.JobClient: FILE_BYTES_READ=871
13/04/19 17:24:23 INFO mapred.JobClient: HDFS_BYTES_READ=2330
13/04/19 17:24:23 INFO mapred.JobClient: FILE_BYTES_WRITTEN=272508
13/04/19 17:24:23 INFO mapred.JobClient: HDFS_BYTES_WRITTEN=10485760079
13/04/19 17:24:23 INFO mapred.JobClient: Map-Reduce Framework
13/04/19 17:24:23 INFO mapred.JobClient: Map output materialized bytes=925
13/04/19 17:24:23 INFO mapred.JobClient: Map input records=10
13/04/19 17:24:23 INFO mapred.JobClient: Reduce shuffle bytes=925
13/04/19 17:24:23 INFO mapred.JobClient: Spilled Records=100
13/04/19 17:24:23 INFO mapred.JobClient: Map output bytes=765
13/04/19 17:24:23 INFO mapred.JobClient: Total committed heap usage (bytes)=7996702720
13/04/19 17:24:23 INFO mapred.JobClient: CPU time spent (ms)=104520
13/04/19 17:24:23 INFO mapred.JobClient: Map input bytes=260
13/04/19 17:24:23 INFO mapred.JobClient: SPLIT_RAW_BYTES=1210
13/04/19 17:24:23 INFO mapred.JobClient: Combine input records=0
13/04/19 17:24:23 INFO mapred.JobClient: Reduce input records=50
13/04/19 17:24:23 INFO mapred.JobClient: Reduce input groups=5
13/04/19 17:24:23 INFO mapred.JobClient: Combine output records=0
13/04/19 17:24:23 INFO mapred.JobClient: Physical memory (bytes) snapshot=7111999488
13/04/19 17:24:23 INFO mapred.JobClient: Reduce output records=5
13/04/19 17:24:23 INFO mapred.JobClient: Virtual memory (bytes) snapshot=28466053120
13/04/19 17:24:23 INFO mapred.JobClient: Map output records=50
java.io.FileNotFoundException: File does not exist: /benchmarks/TestDFSIO/io_write/part-00000
    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.fetchLocatedBlocks(DFSClient.java:1975)
    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.openInfo(DFSClient.java:1944)
    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.<init>(DFSClient.java:1936)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:731)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:165)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:427)
    at org.apache.hadoop.fs.TestDFSIO.analyzeResult(TestDFSIO.java:339)
    at org.apache.hadoop.fs.TestDFSIO.run(TestDFSIO.java:462)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
    at org.apache.hadoop.fs.TestDFSIO.main(TestDFSIO.java:317)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
    at org.apache.hadoop.test.AllTestDriver.main(AllTestDriver.java:81)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
crawler@d1r2n2:/hadoop$ bin/hadoop fs -ls /benchmarks/TestDFSIO/io_write
Found 3 items
-rw-r--r--   2 crawler supergroup          0 2013-04-19 17:24 /benchmarks/TestDFSIO/io_write/_SUCCESS
drwxr-xr-x   - crawler supergroup          0 2013-04-19 17:23 /benchmarks/TestDFSIO/io_write/_logs
-rw-r--r--   2 crawler supergroup         79 2013-04-19 17:24 /benchmarks/TestDFSIO/io_write/part-00000.deflate
crawler@d1r2n2:/hadoop$
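For what it's worth, the write itself looks fine: HDFS_BYTES_WRITTEN matches the requested data exactly. A quick sanity check of the arithmetic (assuming 1 MB = 1024*1024 bytes, which is what TestDFSIO uses):

```python
# Sanity check on the job counters:
# HDFS_BYTES_WRITTEN=10485760079 should be nrFiles * fileSize
# plus the small reduce output file (Bytes Written=79 above).
nr_files = 10
file_size_mb = 1000
data_bytes = nr_files * file_size_mb * 1024 * 1024  # 10 files of 1000 MB each
result_bytes = 79                                   # the 79-byte part-00000.deflate result file
print(data_bytes + result_bytes)  # 10485760079, matching the counter
```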
Does anyone have any idea what might be wrong here? TestDFSIO's analyzeResult is opening /benchmarks/TestDFSIO/io_write/part-00000, but the only output file the job actually produced is part-00000.deflate.
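My current guess is that output compression is enabled cluster-wide (mapred.output.compress=true somewhere in my config), so the reducer wrote part-00000.deflate while TestDFSIO reads the uncompressed path. This is what I was planning to try next; treating the -D override as an assumption, since TestDFSIO is run through ToolRunner and should accept generic options before its own arguments:

```shell
# Read the compressed result directly; `fs -text` inflates .deflate files:
bin/hadoop fs -text /benchmarks/TestDFSIO/io_write/part-00000.deflate

# Re-run with output compression disabled for this job only
# (generic -D options must come before the tool's own arguments):
bin/hadoop jar hadoop-test-1.1.1.jar TestDFSIO \
    -D mapred.output.compress=false \
    -write -nrFiles 10 -fileSize 1000
```

Can anyone confirm whether that is the right knob, or whether TestDFSIO is expected to handle compressed output itself?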