Try increasing "DataNode volumes failure toleration" dfs.datanode.failed.volumes.tolerated
http://hadoop.apache.org/docs/r2.3.0/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml Thanks, Alejandro On Mon, Nov 17, 2014 at 7:42 AM, SSamineni <[email protected]> wrote: > I am trying to install hadoop through ambari, but host check failing with > error"Not enough disk space on host (). A minimum of 2GB is required for " > is there a way to bypass disk check ? > > -- [image: Hortonworks, Inc.] <http://hortonworks.com/> *Alejandro Fernandez <[email protected]>**Engineering - Ambari* 786.303.7149 -- CONFIDENTIALITY NOTICE NOTICE: This message is intended for the use of the individual or entity to which it is addressed and may contain information that is confidential, privileged and exempt from disclosure under applicable law. If the reader of this message is not the intended recipient, you are hereby notified that any printing, copying, dissemination, distribution, disclosure or forwarding of this communication is strictly prohibited. If you have received this communication in error, please contact the sender immediately and delete it from your system. Thank You.
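Following up on the pointer above: the property lives in hdfs-site.xml (in Ambari, under the HDFS service configs). A minimal sketch of the relevant entry, assuming a tolerance of one failed volume; the value of 1 is illustrative, and the default in hdfs-default.xml is 0:

    <property>
      <name>dfs.datanode.failed.volumes.tolerated</name>
      <value>1</value>
      <!-- Illustrative value: the number of volumes allowed to fail before
           the DataNode stops offering service. With the default of 0, any
           single volume failure causes the DataNode to shut down. -->
    </property>

After changing the value, the DataNodes need to be restarted for it to take effect.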
