How can I tell my job to include all the subdirectories (and their contents) under a certain path?

My directory structure is as follows: logs/{YEAR}/{MONTH}/{DAY}. I tried setting my input path to 'logs/' using FileInputFormat.addInputPath (a rough sketch of my driver setup is below the stack trace), but I keep getting the following error:

java.io.FileNotFoundException: File does not exist: /user/root/logs/2011/01
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.openInfo(DFSClient.java:1586)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.<init>(DFSClient.java:1577)
        at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:428)
        at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:187)
        at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:456)
        at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.initialize(LineRecordReader.java:67)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:450)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:240)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
        at org.apache.hadoop.mapred.Child.main(Child.java:234)


Do my directories/directory contents need to be in any particular format?

Thanks
