Re: problems with Hadoop installation

2014-10-29 Thread Bhooshan Mogal
Hi David, JAVA_HOME should point to the Java installation directory. Typically, this directory contains a subdirectory called 'bin', and Hadoop looks for the java command at $JAVA_HOME/bin/java. It is likely that /usr/bin/java is a symlink to some other file. If you do an ls -l
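The check described in that reply can be sketched as a small, self-contained Java program; it is not Hadoop-specific, and the printed paths depend entirely on the local JVM:

```java
import java.io.File;

public class JavaHomeCheck {
    public static void main(String[] args) {
        // The running JVM reports its own installation directory here.
        // A JAVA_HOME suitable for Hadoop is a directory that contains bin/java.
        String javaHome = System.getProperty("java.home");
        File javaBinary = new File(javaHome, "bin" + File.separator + "java");
        System.out.println("java.home = " + javaHome);
        System.out.println("bin/java exists: " + javaBinary.exists());
    }
}
```

On Linux, `ls -l /usr/bin/java` (or `readlink -f /usr/bin/java`) shows where the symlink chain actually ends; JAVA_HOME should then be set to the directory one level above that `bin` directory.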

Re: conf.get(dfs.data.dir) return null when hdfs-site.xml doesn't set it explicitly

2014-09-09 Thread Bhooshan Mogal
object is built from Hadoop cluster node xml files, basically the resource manager node core-site.xml and mapred-site.xml and yarn-site.xml. Am I correct? TIA Susheel Kumar On 9/9/14, Bhooshan Mogal bhooshan.mo...@gmail.com wrote: Hi Demai, conf = new Configuration() will create

Re: conf.get(dfs.data.dir) return null when hdfs-site.xml doesn't set it explicitly

2014-09-08 Thread Bhooshan Mogal
Hi Demai, When you read a property from the conf object, it will only have a value if one of the resources loaded into that conf object defines it. In your case, you created the conf object as new Configuration() -- that adds only core-default.xml and core-site.xml. Then you added site xmls (hdfs-site.xml and core-site.xml) from
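The layering described in that reply can be mimicked with a toy, Hadoop-free model: a configuration is an ordered set of resources, and get() returns null unless some loaded resource defines the key. The class and values below are purely illustrative stand-ins, not Hadoop's real Configuration API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A toy stand-in for org.apache.hadoop.conf.Configuration: a property
// only exists after a resource defining it has been added.
public class ToyConfiguration {
    private final Map<String, String> props = new LinkedHashMap<>();

    // Later resources override earlier ones, as with addResource() in Hadoop.
    public void addResource(Map<String, String> resource) {
        props.putAll(resource);
    }

    public String get(String name) {
        return props.get(name); // null if no loaded resource defines the key
    }

    public static void main(String[] args) {
        ToyConfiguration conf = new ToyConfiguration();
        // Simulates 'new Configuration()': only core-default/core-site
        // style resources are present, so dfs.data.dir is absent.
        conf.addResource(Map.of("fs.defaultFS", "hdfs://namenode:8020"));
        System.out.println(conf.get("dfs.data.dir"));   // null

        // Simulates explicitly adding hdfs-site.xml as a resource.
        conf.addResource(Map.of("dfs.data.dir", "/data/dfs"));
        System.out.println(conf.get("dfs.data.dir"));   // /data/dfs
    }
}
```

The design point is the same as in the thread: Configuration is not a live view of the cluster's settings; it reflects only the resource files loaded into that particular object.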

Re: conf.get(dfs.data.dir) return null when hdfs-site.xml doesn't set it explicitly

2014-09-08 Thread Bhooshan Mogal
= new Configuration() to connect to hdfs and did other operations, shouldn't I be able to retrieve the configuration variables? Thanks Demai On Mon, Sep 8, 2014 at 2:40 PM, Bhooshan Mogal bhooshan.mo...@gmail.com wrote: Hi Demai, When you read a property from the conf object

Re: Pivotal-HD Hadoop installation

2013-12-23 Thread Bhooshan Mogal
As Nitin suggested, this forum is for Apache Hadoop questions. Please use http://ask.gopivotal.com/hc/communities/public/topics/200053048-Pivotal-HD-Forum for questions about Pivotal HD. Thanks, Bhooshan. Sent via the Samsung Galaxy S™III, an AT&T 4G LTE smartphone Original message

Accessing a secure cluster from another

2013-10-08 Thread Bhooshan Mogal
Hi, What's the recommended way to access a secure cluster from another (both are configured to use the same Kerberos realm)? For example, can I run a MapReduce job with input on a secure cluster and output on another? Do I have to change any configurations or add specific credentials for the