Is there any additional configuration needed to run against S3 beyond
what these instructions describe?
http://wiki.apache.org/hadoop/AmazonS3
Following the instructions on that page, when I try to run
"start-dfs.sh" I see the following exception in the logs:
2008-04-04 17:03:31,345 ERROR org.apache.hadoop.dfs.NameNode:
java.lang.IllegalArgumentException: port out of range:-1
at java.net.InetSocketAddress.<init>(InetSocketAddress.java:118)
at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:125)
at org.apache.hadoop.dfs.NameNode.initialize(NameNode.java:119)
at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:176)
at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:162)
at org.apache.hadoop.dfs.NameNode.createNameNode(NameNode.java:846)
at org.apache.hadoop.dfs.NameNode.main(NameNode.java:855)
The NameNode fails to start and exits after this exception is thrown.
The bucket name I am using is of the format 'myuniqueprefix-hadoop'.
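In case the configuration matters, my hadoop-site.xml follows the
wiki's example along these lines (keys redacted; per the wiki, the
credentials could instead be embedded in the URI itself as
s3://ID:SECRET@BUCKET):

<!-- Use the S3 block filesystem as the default filesystem -->
<property>
  <name>fs.default.name</name>
  <value>s3://myuniqueprefix-hadoop</value>
</property>
<property>
  <name>fs.s3.awsAccessKeyId</name>
  <value>REDACTED</value>
</property>
<property>
  <name>fs.s3.awsSecretAccessKey</name>
  <value>REDACTED</value>
</property>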
I am using Hadoop 0.16.2 on OS X. I didn't find any JIRA issues for
this, or any resolution in previous email threads. Is there a known
workaround, or have I missed a necessary configuration step?
Thanks,
Craig B