In recent versions of Mac OS X, a default Hadoop configuration, such as the
one from Homebrew
<https://github.com/Homebrew/homebrew/blob/a49c2fa1244032f12ba0e1121a934bcecfe10182/Library/Formula/hadoop.rb>,
raises errors on some operations:

$ hadoop version
Hadoop 1.2.1
Subversion https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152
Compiled by mattf on Mon Jul 22 15:23:09 PDT 2013
From source with checksum 6923c86528809c4e7e6f493b6b413a9a
This command was run using
/usr/local/Cellar/hadoop/1.2.1/libexec/hadoop-core-1.2.1.jar

When querying HDFS, Hadoop reports a realm error:

$ hadoop fsck / -files -bytes
2014-03-12 10:55:48.330 java[12749:1703] Unable to load realm info from
SCDynamicStore
...

This happens because realms are not configured by default. Setting
HADOOP_OPTS in hadoop-env.sh fixes the realm error:

export HADOOP_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc="
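
On a Homebrew install of Hadoop 1.2.1, hadoop-env.sh presumably lives at
/usr/local/Cellar/hadoop/1.2.1/libexec/conf/hadoop-env.sh (the path is an
assumption based on the Cellar layout shown by hadoop version above), so one
way to apply the setting is:

$ echo 'export HADOOP_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc="' >> /usr/local/Cellar/hadoop/1.2.1/libexec/conf/hadoop-env.sh

With that in place, the same fsck runs without the warning: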

$ hadoop fsck / -files -bytes
...

Another error occurs when attempting to start a namenode:

$ hadoop namenode
14/03/12 11:25:25 ERROR namenode.NameNode:
java.lang.IllegalArgumentException: Does not contain a valid host:port
authority: file:///

This is because the core-site.xml bundled with Hadoop does not specify a
namenode address. Adding the fs.default.name property fixes this:

  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:8020</value>
  </property>
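
For context, the property goes inside the <configuration> element of
core-site.xml; on this Homebrew install that file would presumably be
/usr/local/Cellar/hadoop/1.2.1/libexec/conf/core-site.xml (path assumed). A
minimal working file would look like:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>

With the address in place, the namenode starts: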

$ hadoop namenode
...
14/03/12 11:27:53 INFO ipc.Server: IPC Server listener on 8020: starting
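
A related first-run gotcha for the same audience: on a fresh install the
namenode will not start until HDFS has been formatted once (presumably
already done here):

$ hadoop namenode -format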

These manual configuration fixes might scare off newbies. Could future
Hadoop releases ship with better defaults, for a smoother out-of-the-box
experience?

-- 
Cheers,

Andrew Pennebaker
apenneba...@42six.com
