Hello Tien,

There is a tool in Hadoop DFS called fsck. I hope it will serve your
purpose well.

For example:
$HADOOP_HOME/bin/hadoop fsck <file or directory path> -files -blocks
-locations

The command above will display the blocks of a file and the locations
(datanodes) where those blocks are stored. It will also display other
useful information about files and directories.
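As a rough illustration, the -blocks -locations output can be
post-processed to map a file to its block IDs and hosts. The sample
line below is hypothetical (the exact fsck output format varies between
Hadoop versions), so treat this as a sketch, not a parser for any
specific release:

```python
import re

# Hypothetical fsck output line (format varies across Hadoop versions):
SAMPLE = (
    "0. blk_-123456789 len=67108864 repl=3 "
    "[10.0.0.1:50010, 10.0.0.2:50010, 10.0.0.3:50010]"
)

# Matches a block ID, its length, replication factor, and host list.
BLOCK_RE = re.compile(r"blk_(-?\d+) len=(\d+) repl=(\d+) \[([^\]]+)\]")

def parse_block_line(line):
    """Extract block ID, length, replication, and locations from one line."""
    m = BLOCK_RE.search(line)
    if m is None:
        return None
    block_id, length, repl, hosts = m.groups()
    return {
        "block_id": block_id,
        "length": int(length),
        "replication": int(repl),
        "locations": [h.strip() for h in hosts.split(",")],
    }

info = parse_block_line(SAMPLE)
print(info["block_id"], info["replication"], info["locations"])
```

Running fsck on the whole file and feeding each line through a parser
like this would give you the full block-to-datanode mapping.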

For more information on fsck, see this URL:
http://hadoop.apache.org/core/docs/r0.19.0/hdfs_user_guide.html#fsck


Thanks,
---
Peeyush

On Fri, 2009-01-23 at 15:24 -0800, tienduc_dinh wrote:

> hi everyone,
> 
> I got a question, maybe you can help me.
> 
> - How can we get the metadata of a file on HDFS?
> 
> For example: if I have a file of, say, 2 GB on HDFS, this file is split
> into many chunks and these chunks are distributed over many nodes. Is there
> any trick to find out which chunks belong to that file?
> 
> Any help will be appreciated, thanks a lot.
> 
> Tien Duc Dinh
