Thanks Henning for the information.
Roberta

*From:* Henning Kropp [mailto:[email protected]]
*Sent:* Tuesday, March 22, 2016 2:22 PM
*To:* [email protected]
*Subject:* Re: Trying to create hbase tables after enabling Kerberos with Ambari

Roberta,

please try this resource: https://steveloughran.gitbooks.io/kerberos_and_hadoop/content/

Hope it helps.

Regards,
Henning

On 22/03/16 at 20:49, Roberta Marton wrote:

As I work more with Kerberos, I have questions that I cannot seem to figure out from the documentation and from scanning the internet. Maybe you can answer them.

From the Ambari documentation:

“Each service and sub-service in Hadoop must have its own principal. A principal name in a given realm consists of a primary name and an instance name, which in this case is the FQDN of the host that runs that service. As services do not log in with a password to acquire their tickets, their principal's authentication credentials are stored in a keytab file, which is extracted from the Kerberos database and stored locally with the service principal on the service component host.”

As part of enabling Kerberos, Ambari creates all these service principals and keytabs. So my question is, how are tickets managed between the Hadoop services? For example, HBase needs to talk to HDFS to write some data. If I initiate this request, does HBase pass my ticket along to services like HDFS, or does it intercept the request and send its own ticket to HDFS to handle the request?

How do HBase and the other Hadoop services manage their own ticket renewal and expiration? Do they use a thread to automatically renew the ticket, as suggested in many forums? What happens if the ticket expires in the middle of a request? Is there code in each service to determine that a ticket is about to expire, perform a kinit to create a new ticket, and send it seamlessly down the line?

Regards,
Roberta

*From:* Robert Levas [mailto:[email protected]]
*Sent:* Tuesday, March 22, 2016 6:45 AM
*To:* [email protected]
*Subject:* Re: Trying to create hbase tables after enabling Kerberos with Ambari

Henning… I didn’t know about that hadoop command. This is awesome. Thanks!

hadoop org.apache.hadoop.security.HadoopKerberosName [email protected]

Rob

*From:* Henning Kropp <[email protected]>
*Reply-To:* "[email protected]" <[email protected]>
*Date:* Monday, March 21, 2016 at 5:49 PM
*To:* "[email protected]" <[email protected]>
*Subject:* Re: Trying to create hbase tables after enabling Kerberos with Ambari

Hi,

what Robert suggested sounds to me like exactly what you need. It would help if you could provide your auth_to_local setting and the output of:

hbase> whoami

Another way to test your auth_to_local setting would be to execute:

% hadoop org.apache.hadoop.security.HadoopKerberosName [email protected]

Please be aware that the rules are applied in order, so it is important to place the rule from Robert before the default rule. A simpler rule could also be:

RULE:[1:$1@$0]([email protected])s/.*/trafodion/

The above rule will only work for this one principal/user. Put it as the first line of your auth_to_local setting and use HadoopKerberosName to test whether it is working.

Regards,
Henning

On 21/03/16 at 21:40, Roberta Marton wrote:

Thanks for your suggestion. My property settings did have the second rule defined, but not the first. However, it did not seem to help. I tried setting the rule several other ways, but nothing seems to work. I still get the same behavior.

Roberta
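For reference, the rule-ordering check Henning describes can also be done programmatically against the same class the command line uses. The sketch below is only illustrative: the class name AuthToLocalCheck is hypothetical, the ruleset simply mirrors the rules discussed in this thread, and it assumes hadoop-common is on the classpath.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.security.HadoopKerberosName;

    // Hypothetical test class: applies an auth_to_local ruleset to a principal
    // and prints the resulting short name, mirroring what
    // "hadoop org.apache.hadoop.security.HadoopKerberosName <principal>" does.
    public class AuthToLocalCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // The specific rule comes first, then the more general
            // realm-stripping rule, then the default rule.
            conf.set("hadoop.security.auth_to_local",
                "RULE:[1:$1@$0]([email protected])s/.*/trafodion/\n"
              + "RULE:[1:$1@$0](.*@TRAFKDC.COM)s/@.*//\n"
              + "DEFAULT");
            HadoopKerberosName.setConfiguration(conf);

            // Expected to print "trafodion" when the first rule matches.
            System.out.println(
                new HadoopKerberosName("[email protected]")
                    .getShortName());
        }
    }

If the specific rule were listed after the general realm-stripping rule instead, the same call would print "trafodion-robertaCluster", which matches the permission failure described below.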
*From:* Robert Levas [mailto:[email protected]]
*Sent:* Monday, March 21, 2016 11:21 AM
*To:* [email protected]
*Subject:* Re: Trying to create hbase tables after enabling Kerberos with Ambari

Hi Roberta…

It seems like you need an auth-to-local rule set up to translate [email protected] to trafodion. You can do this by editing the hadoop.security.auth_to_local property under HDFS -> Configs -> Advanced -> Advanced core-site. Adding the following rule should do the trick:

RULE:[1:$1@$0](.*[email protected])s/-robertaCluster@.*//

You will need to add this rule to the ruleset before/above less general rules like:

RULE:[1:$1@$0](.*@TRAFKDC.COM)s/@.*//

After adding this rule, save the config and restart the recommended services.

I hope this helps,

Rob

*From:* Roberta Marton <[email protected]>
*Reply-To:* "[email protected]" <[email protected]>
*Date:* Monday, March 21, 2016 at 2:08 PM
*To:* "[email protected]" <[email protected]>
*Subject:* Trying to create hbase tables after enabling Kerberos with Ambari

I am trying to install Kerberos on top of my Hortonworks installation. I have tried this with both versions 2.2 and 2.3 and get similar results. After I enable Kerberos, I create a Linux user called trafodion and grant this user all HBase permissions. I connect as trafodion but get permission errors when I try to create a table.

Details:

[trafodion@myhost ~]$ whoami
trafodion

[trafodion@myhost ~]$ klist
Ticket cache: FILE:/tmp/krb5cc_503
Default principal: [email protected]

Valid starting       Expires              Service principal
03/21/16 16:39:33    03/22/16 16:39:33    krbtgt/[email protected]
        renew until 03/21/16 16:39:33

hbase shell

hbase(main):002:0> whoami
[email protected] (auth:KERBEROS)
2016-03-21 17:06:22,925 WARN [main] security.UserGroupInformation: No groups available for user trafodion-robertaCluster

hbase(main):003:0> user_permission
User        Table,Family,Qualifier:Permission
 trafodion  hbase:acl,,: [Permission: actions=READ,WRITE,EXEC,CREATE,ADMIN]
 ambari-qa  hbase:acl,,: [Permission: actions=READ,WRITE,EXEC,CREATE,ADMIN]
2 row(s) in 1.7630 seconds

hbase(main):004:0> create 't1', 'f1', 'f2'
ERROR: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions for user 'trafodion-robertaCluster' (global, action=CREATE)

I am able to perform ‘user_permission’ but not ‘create’. Any suggestion on how to proceed?

Roberta
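On Roberta's earlier question about how the services manage their own tickets: Hadoop daemons generally do not forward the end user's TGT between themselves; each service logs in from its own keytab through UserGroupInformation and re-logs in from the keytab as the ticket nears expiry, rather than shelling out to kinit. A minimal sketch of that pattern follows, assuming a service-side Java process; the class name, principal, and keytab path are placeholders, not values from this cluster.

    import java.security.PrivilegedExceptionAction;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    // Sketch of the keytab login / re-login pattern used by Hadoop services.
    public class KeytabLoginSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);

            // Log in once from the keytab; no password or interactive kinit.
            UserGroupInformation ugi = UserGroupInformation
                .loginUserFromKeytabAndReturnUGI(
                    "hbase/[email protected]",                     // placeholder principal
                    "/etc/security/keytabs/hbase.service.keytab"); // placeholder path

            // Renew the TGT from the keytab if it is close to expiring.
            ugi.checkTGTAndReloginFromKeytab();

            // Perform work as the logged-in service principal.
            ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
                FileSystem fs = FileSystem.get(conf);
                System.out.println(fs.exists(new Path("/")));
                return null;
            });
        }
    }

Long-running daemons typically call checkTGTAndReloginFromKeytab() from a background thread or before issuing RPCs, which corresponds to the "renewal thread" behavior mentioned in the forums Roberta refers to.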
