My local environment: a single Ubuntu 11.10 (desktop edition) machine, Oracle JDK
7.0_04, MIT Kerberos 5, and Apache Hadoop 1.0.2.

I am able to get Kerberos working; here is my ticket cache:
------------------------------------------------------------------------------------------------------------------------------------------
allan@localhost:~/tools/UnlimitedJCEPolicy$ klist -e
Ticket cache: FILE:/tmp/krb5cc_1000
Default principal: allan/admin@LOCALDOMAIN

Valid starting     Expires            Service principal
06/03/12 22:55:30  06/04/12 08:55:30  krbtgt/LOCALDOMAIN@LOCALDOMAIN
        renew until 06/10/12 22:55:28, Etype (skey, tkt): aes256-cts-hmac-sha1-96, aes256-cts-hmac-sha1-96
------------------------------------------------------------------------------------------------------------------------------------------
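For completeness, this is roughly how the ticket above was obtained (plain MIT Kerberos commands, nothing Hadoop-specific; shown only as a sketch):
------------------------------------------------------------------------------------------------------------------------------------------
# password-based login as the admin principal, then list tickets with their encryption types
kinit allan/admin@LOCALDOMAIN
klist -e
------------------------------------------------------------------------------------------------------------------------------------------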

However, after turning on Hadoop security, I am not able to start the NameNode.
I turned on Java's Kerberos security debugging; here are the debug output from
starting the daemons and the error message from the NameNode log:
------------------------------------------------------------------------------------------------------------------------------------------
starting namenode, logging to /usr/local/hadoop-1.0.2/libexec/../logs/hadoop-allan-namenode-localhost.localdomain.out
Config name: /etc/krb5.conf
Ordering keys wrt default_tkt_enctypes list
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
localhost: starting datanode, logging to /usr/local/hadoop-1.0.2/libexec/../logs/hadoop-allan-datanode-localhost.localdomain.out
localhost: Config name: /etc/krb5.conf
localhost: >>>KinitOptions cache name is /tmp/krb5cc_1000
localhost: >>>DEBUG <CCacheInputStream> client principal is allan/admin@LOCALDOMAIN
localhost: >>>DEBUG <CCacheInputStream> server principal is krbtgt/LOCALDOMAIN@LOCALDOMAIN
localhost: >>>DEBUG <CCacheInputStream> key type: 18
localhost: >>>DEBUG <CCacheInputStream> auth time: Sun Jun 03 22:17:13 PDT 2012
localhost: >>>DEBUG <CCacheInputStream> start time: Sun Jun 03 22:17:18 PDT 2012
localhost: >>>DEBUG <CCacheInputStream> end time: Mon Jun 04 08:17:18 PDT 2012
localhost: >>>DEBUG <CCacheInputStream> renew_till time: Sun Jun 10 22:17:08 PDT 2012
localhost: >>> CCacheInputStream: readFlags()  FORWARDABLE; RENEWABLE; INITIAL; PRE_AUTH;
localhost: starting secondarynamenode, logging to /usr/local/hadoop-1.0.2/libexec/../logs/hadoop-allan-secondarynamenode-localhost.localdomain.out
------------------------------------------------------------------------------------------------------------------------------------------
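For reference, the Kerberos debug lines above come from the JDK's debug flag; I enable it roughly like this in conf/hadoop-env.sh (a sketch only; the exact OPTS variable may differ in your setup):
------------------------------------------------------------------------------------------------------------------------------------------
# conf/hadoop-env.sh: turn on the JDK's Kerberos debug output for the Hadoop daemons
export HADOOP_OPTS="$HADOOP_OPTS -Dsun.security.krb5.debug=true"
------------------------------------------------------------------------------------------------------------------------------------------

The NameNode log itself then shows: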

------------------------------------------------------------------------------------------------------------------------------------------
2012-06-03 22:53:02,349 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = localhost.localdomain/127.0.0.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.0.2
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0.2 -r 1304954; compiled by 'hortonfo' on Sat Mar 24 23:58:21 UTC 2012
************************************************************/
2012-06-03 22:53:02,488 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2012-06-03 22:53:02,499 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source MetricsSystem,sub=Stats registered.
2012-06-03 22:53:02,500 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2012-06-03 22:53:02,500 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
2012-06-03 22:53:02,632 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi registered.
2012-06-03 22:53:02,718 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: java.io.IOException: Login failure for allan/admin@LOCALDOMAIN from keytab /etc/krb5kdc/kadm5.keytab
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:602)
    at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:263)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:264)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:496)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1279)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1288)
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user
    at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:852)
    at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:715)
    at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:580)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at javax.security.auth.login.LoginContext.invoke(LoginContext.java:784)
    at javax.security.auth.login.LoginContext.access$000(LoginContext.java:203)
    at javax.security.auth.login.LoginContext$5.run(LoginContext.java:721)
    at javax.security.auth.login.LoginContext$5.run(LoginContext.java:719)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:718)
    at javax.security.auth.login.LoginContext.login(LoginContext.java:590)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:593)
    ... 5 more

2012-06-03 22:53:02,719 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
------------------------------------------------------------------------------------------------------------------------------------------

I am following the instructions at
https://ccp.cloudera.com/display/CDHDOC/Configuring+Hadoop+Security+in+CDH3
to configure the keytabs/principals and the Hadoop config files, except that I am
not using separate hdfs and mapred Linux users; I simply use a single user
"allan" everywhere in my config files. Could that be a problem even just for
getting the NameNode started and the filesystem shell commands working?
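
In case it is relevant, this is the kind of check I run against the keytab the NameNode is pointed at (path and principal taken from the error message above; just a sketch of standard MIT Kerberos commands):
------------------------------------------------------------------------------------------------------------------------------------------
# list the principals and encryption types stored in the keytab
# (run as the same user that starts the NameNode, so file permissions are also exercised)
klist -k -t -e /etc/krb5kdc/kadm5.keytab
# try a non-interactive, keytab-based login as the configured principal
kinit -k -t /etc/krb5kdc/kadm5.keytab allan/admin@LOCALDOMAIN
klist -e
------------------------------------------------------------------------------------------------------------------------------------------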

Also, I found that https://issues.apache.org/jira/browse/HADOOP-6947 reports the
same error message, although the cause there is different. So I also tried:
1) Apache Hadoop 0.23.0 (a release that carries the HADOOP-6947 patch) and CDH3
Beta 4 (the release with the security feature);
2) both JDK 6 and JDK 7, with the "Java Cryptography Extension (JCE) Unlimited
Strength Jurisdiction Policy Files" installed (as sketched below) so that
AES-256 encryption works.
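
For reference, installing the policy files amounted to copying the two jars from the downloaded archive over the default ones in the JRE that Hadoop actually runs on (sketch; $JAVA_HOME points at that JDK):
------------------------------------------------------------------------------------------------------------------------------------------
# replace the default strength-limited policy jars with the unlimited-strength ones
cd ~/tools/UnlimitedJCEPolicy
sudo cp local_policy.jar US_export_policy.jar $JAVA_HOME/jre/lib/security/
------------------------------------------------------------------------------------------------------------------------------------------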

Still no luck. Can you please help me figure out what is wrong with my setup?

Thanks,
Hailun Yan
