Hi all,

I'm new to HDFS. Here is just a quick question: suppose I have a big file
stored in HDFS, is there any way to generate a file containing all information
about the blocks belonging to this file?
For example, a list of records in the format "block_id, length, offset, hosts[],
local/path/to/this/block"?

The purpose is to enable programs to access only the blocks stored on the same
node, to take advantage of block locality.

I can retrieve most of this information using getFileBlockLocations(), but I didn't
find a way to get the local path to the block file.
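For reference, here is a minimal sketch of what I'm doing so far with the client-side
FileSystem API (class and file names are just placeholders): it prints the offset,
length, and replica hosts per block, but as far as I can tell BlockLocation does not
expose the DataNode-local path of the block file.

    import java.util.Arrays;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.BlockLocation;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class BlockInfoDump {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            Path file = new Path(args[0]);
            FileStatus status = fs.getFileStatus(file);

            // One BlockLocation per block of the file
            BlockLocation[] blocks =
                fs.getFileBlockLocations(status, 0, status.getLen());

            for (BlockLocation block : blocks) {
                // Offset, length, and replica hosts are available here;
                // the local path of the block file on the DataNode is not.
                System.out.printf("offset=%d length=%d hosts=%s%n",
                    block.getOffset(),
                    block.getLength(),
                    Arrays.toString(block.getHosts()));
            }
        }
    }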

Thanks,
Yuduo
