Hello!




I'm trying to perform a read test of HDFS files through libhdfs using
the hadoop-0.18.2/src/c++/libhdfs/hdfs_read.c test program. Creating
the files succeeds but reading them fails.



I create two 1 MB local files with hdfs_write.c and then put them into HDFS
using hadoop fs -put. The files end up under dfs.data.dir as:

hdfs://server:port/dfs.data.dir/file1 and
hdfs://server:port/dfs.data.dir/file2

Then I try to read them back with hdfs_read and measure the time it takes,
but I get the following exceptions:



Reading file:///////home/sony/hadoop/dfs/blocks/file1 1MB

Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: hdfs://myserver.com:23000/home/sony/hadoop/dfs/blocks/file1, expected: file:///
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:320)
        at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:52)

Call to org.apache.hadoop.fs.FileSystem::open((Lorg/apache/hadoop/fs/Path;I)Lorg/apache/hadoop/fs/FSDataInputStream;) failed!

hdfs_read.c: Failed to open hdfs://myserver.com:23000/home/sony/hadoop/dfs/blocks/file1 for writing!

..

Reading file:////////home/sony/hadoop/dfs/blocks/file2 1MB

Exception in thread "main" java.lang.IllegalArgumentException: Wrong FS: hdfs://myserver.com:23000/home/sony/hadoop/dfs/blocks/file2, expected: file:///
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:320)
        at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:52)

Call to org.apache.hadoop.fs.FileSystem::open((Lorg/apache/hadoop/fs/Path;I)Lorg/apache/hadoop/fs/FSDataInputStream;) failed!

hdfs_read.c: Failed to open hdfs://myserver.com:23000/home/sony/hadoop/dfs/blocks/file2 for writing!



Am I using an incorrect URI? What could the problem be?
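In case the configuration matters: as far as I understand, the client resolves its default filesystem from fs.default.name in hadoop-site.xml, and the "expected: file:///" in the exception looks like it fell back to the local filesystem. I would have expected a setting along these lines (host and port copied from the error messages above) to point it at the namenode, but maybe I have this wrong:

```xml
<!-- hadoop-site.xml (0.18.x); host and port taken from the errors above -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://myserver.com:23000</value>
</property>
```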



Cheers,

Tamas


      
