Hi Mohit,

If you're using an HDFS URL, you'll want the form hdfs://host:port/path/to/file, e.g. hdfs://master:9000/user/hadoop/foo.csv.
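You can see the difference between the two URI forms with plain Python URL parsing (this is just the standard library's urlparse, not Spark or Hadoop itself; master:9000 is an example namenode address):

```python
from urllib.parse import urlparse

# A fully qualified HDFS URI carries the namenode host:port in the
# authority (netloc) component, which Hadoop requires.
full = urlparse("hdfs://master:9000/user/hadoop/foo.csv")
print(full.scheme, full.netloc, full.path)

# With only one slash after the scheme there is no authority at all,
# which is what the "Incomplete HDFS URI, no host" error is about.
short = urlparse("hdfs:/user/hadoop/foo.csv")
print(repr(short.netloc))
```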
The file:/ pattern is right for files on the local machine, but the file generally needs to exist at that same path on every worker node, not just the one you launched from - others can probably confirm the details. HDFS is usually easier to work with for this reason.

Thanks,
Bryn

On Thu, Feb 20, 2014 at 4:25 PM, Mohit Singh <[email protected]> wrote:
> Hi,
> I am trying to read a file from local disk
> and just count the number of lines in that file,
> but I see this error:
> Task 2.0:225 failed 4 times (most recent failure: Exception failure:
> java.io.FileNotFoundException: File
> *file:/*home/hadoop/data/backup/data/domain/domainz0
> does not exist.)
>
> But the file is there..
> Though the file:/ doesn't look right?
> Also, if I try to read from hdfs:
> Incomplete HDFS URI, no host: hdfs:/user/hadoop/foo.csv
> Shouldn't it be hdfs:///user/hadoop/foo.csv
> Am I missing something?
>
> Thanks
>
> --
> Mohit
>
> "When you want success as badly as you want the air, then you will get it.
> There is no other secret of success."
> -Socrates
