Once you updated the configuration, was the datanode restarted? Check whether the 
datanode log indicates that it was able to set up the new volume. 
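
For reference, this is roughly what the relevant properties look like in hdfs-site.xml. A minimal sketch only: the paths are placeholders for your actual mount points, and the data-dir property name depends on the Hadoop version (dfs.data.dir in 1.x, dfs.datanode.data.dir in later releases):

```xml
<!-- hdfs-site.xml (sketch; paths are hypothetical examples) -->
<property>
  <!-- In Hadoop 1.x this property is named dfs.data.dir -->
  <name>dfs.data.dir</name>
  <!-- Comma-separated list; each directory is treated as a separate volume,
       and the datanode's capacity is the sum of the volumes' capacities -->
  <value>/mnt/disk1/dfs/data,/mnt/disk2/dfs/data</value>
</property>
<property>
  <name>dfs.datanode.du.reserved</name>
  <!-- Space in bytes reserved per volume for non-DFS use -->
  <value>0</value>
</property>
```

After editing the file, the datanode has to be restarted before the new volumes show up in its reported capacity.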



>________________________________
> From: Hamed Ghavamnia <ghavamni...@gmail.com>
>To: hdfs-user@hadoop.apache.org 
>Sent: Sunday, January 1, 2012 11:33 AM
>Subject: HDFS Datanode Capacity
> 
>
>Hi,
>I've been searching for how to configure the maximum capacity of a datanode. 
>I've added big volumes to one of my datanodes, but the configured capacity 
>doesn't get bigger than the default 5GB. If I want a datanode with 100GB of 
>capacity, I have to add 20 directories, each having 5GB, so that the maximum 
>capacity reaches 100GB. Is there anywhere this can be set? Can different 
>datanodes have different capacities?
>
>Also, it seems like the dfs.datanode.du.reserved setting doesn't work either: 
>I've set it to zero, but it still leaves 50% of the free space for non-dfs 
>usage.
>
>Thanks,
>Hamed
>
>P.S. This is my first message in the mailing list, so if I have to follow any 
>rules for sending emails, I'll be thankful if you let me know. :)
>
>
>
