hadoop fsck mytext.txt -files -locations -blocks
I expect something like a tag attached to each block (say block X) that shows
where the replicas of X are placed. The method you mentioned is a user-level
task. Am I right?
Regards,
Mahmood
import org.apache.hadoop.hdfs.DFSClient;
import org.apache.hadoop.hdfs.protocol.*;

ClientProtocol namenode = DFSClient.createNamenode(conf);
HdfsFileStatus hfs = namenode.getFileInfo(your_hdfs_file_name);
LocatedBlocks lbs = namenode.getBlockLocations(your_hdfs_file_name, 0,
    hfs.getLen());
for (LocatedBlock lb : lbs.getLocatedBlocks()) {
    // each LocatedBlock carries the DataNodes that hold that block's replicas
    DatanodeInfo[] info = lb.getLocations();
    for (DatanodeInfo dn : info) {
        System.out.println(lb.getBlock() + " -> " + dn.getHostName());
    }
}
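Note that the snippet above goes through HDFS-internal classes (DFSClient, ClientProtocol), which change between Hadoop versions. The public FileSystem API exposes the same block-to-DataNode mapping; here is a minimal sketch along those lines (the class name is my own, and the file path comes in as a program argument). It needs a running cluster and the Hadoop jars on the classpath:

```java
import java.util.Arrays;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ShowReplicas {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        FileStatus st = fs.getFileStatus(new Path(args[0]));
        // one BlockLocation per block; getHosts() names the DataNodes
        // that hold that block's replicas
        for (BlockLocation b : fs.getFileBlockLocations(st, 0, st.getLen())) {
            System.out.println("offset " + b.getOffset() + " -> "
                + Arrays.toString(b.getHosts()));
        }
    }
}
```

This stays on the stable public API, so it keeps working where the internal ClientProtocol route may not.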
hadoop fsck <path> -files -blocks -locations -racks
replace <path> with the real path :)
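If you want the locations programmatically rather than by reading the fsck report by eye, the bracketed DataNode list in each block line can be pulled out with a regex. A rough sketch only: the sample line below is made up, and the exact fsck output format varies across Hadoop versions, so check yours first.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FsckLocations {
    // extract the comma-separated DataNode list between the first [ ... ]
    static String[] datanodes(String fsckLine) {
        Matcher m = Pattern.compile("\\[([^\\]]+)\\]").matcher(fsckLine);
        return m.find() ? m.group(1).split(",\\s*") : new String[0];
    }

    public static void main(String[] args) {
        // hypothetical block line from "hadoop fsck <path> -files -blocks -locations"
        String line = "0. blk_1073741825_1001 len=134217728 repl=3 "
            + "[10.0.0.1:50010, 10.0.0.2:50010, 10.0.0.3:50010]";
        for (String dn : datanodes(line)) {
            System.out.println(dn);
        }
    }
}
```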
From: 一凡 李 [mailto:zhuazhua_...@yahoo.com.cn]
Sent: Tuesday, June 04, 2013 12:49 PM
To: user@hadoop.apache.org
Subject: how to locate the replicas of a file in HDFS?
Hi,
Could you tell me how to locate where each replica of a file is stored in HDFS?
Try this command:
hadoop fsck <file path> -files -blocks
On Tue, Jun 4, 2013 at 3:41 PM, zangxiangyu zangxian...@qiyi.com wrote:
hadoop fsck <path> -files -blocks -locations -racks
replace <path> with the real path :)
From: 一凡 李 [mailto:zhuazhua_...@yahoo.com.cn]
Hi,
Could you tell me where each replica of a file is stored in HDFS?
More precisely, if I create a file in HDFS (replication factor 3), how can I
find the DataNodes that store each of its blocks and their replicas?
Best Wishes,
Yifan
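To make the "each block and replicas" part of the question concrete: a file is split into ceil(length / blockSize) blocks, and each block is stored replication-factor times across DataNodes. A small arithmetic sketch (the 300 MB file and 128 MB block size are just example numbers, not anything from this thread):

```java
public class BlockCount {
    // ceiling division: how many HDFS blocks a file of this length occupies
    static long numBlocks(long fileLen, long blockSize) {
        return (fileLen + blockSize - 1) / blockSize;
    }

    public static void main(String[] args) {
        long fileLen = 300L * 1024 * 1024;   // 300 MB file (example)
        long blockSize = 128L * 1024 * 1024; // 128 MB block size (example)
        int replication = 3;
        long blocks = numBlocks(fileLen, blockSize);
        System.out.println(blocks + " blocks, " + blocks * replication
            + " replicas stored in total");
        // -> 3 blocks, 9 replicas stored in total
    }
}
```

So fsck on such a file would list 3 blocks, each followed by 3 DataNode locations.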
hadoop fsck mytext.txt -files -locations -blocks
Thanks,
Rahul
On Tue, Jun 4, 2013 at 10:19 AM, 一凡 李 zhuazhua_...@yahoo.com.cn wrote:
Hi,
Could you tell me where each replica of a file is stored in HDFS?