RE: Trying to create hbase tables after enabling Kerberos with Ambari

2016-03-22 Thread Roberta Marton
Thanks Henning for the information.



   Roberta



*From:* Henning Kropp [mailto:hkr...@microlution.de]
*Sent:* Tuesday, March 22, 2016 2:22 PM
*To:* user@ambari.apache.org
*Subject:* Re: Trying to create hbase tables after enabling Kerberos with
Ambari



Roberta,

please try this resource
https://steveloughran.gitbooks.io/kerberos_and_hadoop/content/

Hope it helps.

Regards,
Henning

On 22/03/16 at 20:49, Roberta Marton wrote:

As I work more with Kerberos, I have questions that I cannot seem to figure
out from the documentation and scanning the internet.  Maybe you can answer
them.



From the Ambari documentation:



“Each service and sub-service in Hadoop must have its own principal. A
principal name in a given realm consists of a primary name and an instance
name, which in this case is the FQDN of the host that runs that service. As
services do not login with a password to acquire their tickets, their
principal's authentication credentials are stored in a keytab file, which
is extracted from the Kerberos database and stored locally with the service
principal on the service component host.”



As part of enabling Kerberos, Ambari creates all these service principals
and keytabs.  So my question is: how are tickets managed between the Hadoop
services?  For example, HBase needs to talk to HDFS to write some data.  If
I initiate this request, does HBase send my ticket to services like HDFS,
or does it intercept the request and send its own ticket to HDFS to manage
the request?



How does HBase (and other Hadoop services) manage their own ticket renewal
and expiration?  Do they use a thread to automatically renew the ticket, as
suggested in many forums?  What happens if the ticket expires in the middle
of a request?  Is there code in each service to determine that a ticket is
about to expire, perform a kinit to create a new ticket, and send it
seamlessly down the line?
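[For reference: Hadoop services that log in from a keytab generally do not shell out to kinit. The UserGroupInformation class re-logs in from the keytab once a fraction of the ticket lifetime has elapsed; the sketch below only illustrates that renew-window arithmetic (the 0.80 window mirrors Hadoop's TICKET_RENEW_WINDOW constant; exact behavior varies by version):]

```python
# Sketch of Hadoop's keytab re-login timing. Services call a check before
# RPCs; a fresh login is attempted once ~80% of the ticket lifetime is gone,
# so requests normally never see an expired ticket.
TICKET_RENEW_WINDOW = 0.80  # fraction of lifetime, as in UserGroupInformation

def next_relogin_time(start, end, window=TICKET_RENEW_WINDOW):
    """Earliest timestamp at which a fresh keytab login should be attempted."""
    return start + window * (end - start)

def should_relogin(now, start, end, window=TICKET_RENEW_WINDOW):
    """True once the ticket is inside the renew window (or already expired)."""
    return now >= next_relogin_time(start, end, window)

print(next_relogin_time(0, 36000))  # a 10-hour ticket is refreshed after ~8 hours
```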



   Regards,

   Roberta



*From:* Robert Levas [mailto:rle...@hortonworks.com]
*Sent:* Tuesday, March 22, 2016 6:45 AM
*To:* user@ambari.apache.org
*Subject:* Re: Trying to create hbase tables after enabling Kerberos with
Ambari



Henning…



I didn’t know about that hadoop command.  This is awesome. Thanks!



hadoop org.apache.hadoop.security.HadoopKerberosName
trafodion-robertaclus...@trafkdc.com



Rob





*From: *Henning Kropp 
*Reply-To: *"user@ambari.apache.org" 
*Date: *Monday, March 21, 2016 at 5:49 PM
*To: *"user@ambari.apache.org" 
*Subject: *Re: Trying to create hbase tables after enabling Kerberos with
Ambari



Hi,

what Robert suggested sounds like exactly what you need. It would help if
you could provide your auth_to_local setting and the output of:
hbase> whoami

Another way to test your auth_to_local setting would be to execute:
% hadoop org.apache.hadoop.security.HadoopKerberosName
trafodion-robertaclus...@trafkdc.com

Please be aware that the rules are applied in order, so it is important to
have the rule from Robert before the default rule.

A simpler rule could also be:
RULE:[1:$1@$0](trafodion-robertaclus...@trafkdc.com)s/.*/trafodion/

The above rule will only work for this principal/user. Put it as the first
line of your auth_to_local setting and use HadoopKerberosName to test that
it is working.
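[Editor's sketch of the first-match semantics Henning describes, as an illustration only, not Hadoop's actual KerberosName implementation; the unmasked principal trafodion-robertaCluster@TRAFKDC.COM is inferred from the HBase error message later in the thread:]

```python
import re

def apply_auth_to_local(principal, rules):
    """Apply (match, sed_pattern, replacement) rules in order; first match wins."""
    for match, sed_pattern, replacement in rules:
        if re.fullmatch(match, principal):
            return re.sub(sed_pattern, replacement, principal, count=1)
    raise ValueError(f"no rule matched {principal}")

rules = [
    # The exact-match rule first: rewrite the whole principal to "trafodion"...
    (r"trafodion-robertaCluster@TRAFKDC\.COM", r".*", "trafodion"),
    # ...then the default rule, which only strips the realm.
    (r".*@TRAFKDC\.COM", r"@.*", ""),
]

print(apply_auth_to_local("trafodion-robertaCluster@TRAFKDC.COM", rules))  # trafodion
print(apply_auth_to_local("ambari-qa@TRAFKDC.COM", rules))                 # ambari-qa
```

Swapping the two rules would make the default rule fire first and yield the long name trafodion-robertaCluster, which is why ordering matters.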

Regards,
Henning

On 21/03/16 at 21:40, Roberta Marton wrote:

Thanks for your suggestion.  My property settings did have the second rule
defined but not the first.

However, it did not seem to help.

I tried setting the rule several other ways but nothing seems to work.  I
still get the same behavior.



   Roberta



*From:* Robert Levas [mailto:rle...@hortonworks.com]
*Sent:* Monday, March 21, 2016 11:21 AM
*To:* user@ambari.apache.org
*Subject:* Re: Trying to create hbase tables after enabling Kerberos with
Ambari



Hi Roberta…



It seems like you need an auth-to-local rule set up to translate
trafodion-robertaclus...@trafkdc.com to trafodion.



You can do this by editing the hadoop.security.auth_to_local property under
HDFS->Configs->Advanced->Advanced core-site.



Adding the following rule should do the trick:



RULE:[1:$1@$0](.*-robertaclus...@trafkdc.com)s/-robertaCluster@.*//



You will need to add this rule to the ruleset before/above less general
rules like



RULE:[1:$1@$0](.*@TRAFKDC.COM)s/@.*//



After adding this rule, save the config and restart the recommended
services.
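[Editor's note: the rule is essentially a guarded sed substitution: the parenthesized regex selects which principals the rule applies to, and the s/…/…/ part rewrites the matching name. Simulated below as an illustration, assuming the full principal is trafodion-robertaCluster@TRAFKDC.COM (inferred from the HBase error message later in the thread):]

```python
import re

# RULE:[1:$1@$0](regex)s/pattern/replacement/ acts like a guarded sed:
# the regex decides whether the rule fires, the s/// rewrites the name.
principal = "trafodion-robertaCluster@TRAFKDC.COM"  # unmasked form, inferred

if re.fullmatch(r".*-robertaCluster@TRAFKDC\.COM", principal):
    short_name = re.sub(r"-robertaCluster@.*", "", principal, count=1)
else:
    short_name = principal  # rule does not apply; later rules would be tried

print(short_name)  # trafodion
```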



I hope this helps,



Rob







*From: *Roberta Marton 
*Reply-To: *"user@ambari.apache.org" 
*Date: *Monday, March 21, 2016 at 2:08 PM
*To: *"user@ambari.apache.org" 
*Subject: *Trying to create hbase tables after enabling Kerberos with Ambari



I am trying to install Kerberos on top of my Hortonworks installation.  I
have tried this with both versions 2.2 and 2.3 and get similar results.

After I enable Kerberos, I create a Linux user called trafodion and grant


Re: Run Service Check on a certain node

2016-03-22 Thread Sumit Mohanty
I thought the Service Check is only run on CLIENT components associated with
the Service. However, if there are no CLIENT components, then it may be picking
up any host that has any component. Does your service only have a MASTER
component?
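[Editor's note: a common workaround, sketched below under assumptions (the host-list lookup and return values are illustrative, not an Ambari API), is to guard the check script itself so it only does real work on a Master host:]

```python
import socket

def is_master_host(master_hosts):
    """True when this process is running on one of the Master component's hosts."""
    local_names = {socket.gethostname(), socket.getfqdn()}
    return bool(local_names & set(master_hosts))

def service_check(master_hosts):
    # Guard the check itself: probe only where the Master lives, otherwise
    # report a skip instead of a misleading failure on an arbitrary host.
    if not is_master_host(master_hosts):
        return "skipped"
    # ... real health probe against the local Master would go here ...
    return "checked"
```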


From: Pei Heng Zeng 
Sent: Tuesday, March 22, 2016 11:32 AM
To: user@ambari.apache.org
Subject: Run Service Check on a certain node

Hi,

I am trying to implement a Service Check feature for a custom service in 
Ambari. I am running into a problem where I have no control over which node the 
Service Check is run on. This is a problem because the Service Check is only 
designed to run on the node where the Master of the service resides. If the 
Master happens to be on the node that Ambari picks to run the Service Check, 
then the Service Check will run properly. Otherwise, it will basically be a 
no-op.

Did anyone else run into similar issues?

Your response is greatly appreciated.

Sincerely,

Pei Heng Zeng




Re: Trying to create hbase tables after enabling Kerberos with Ambari

2016-03-22 Thread Robert Levas
Henning…

I didn’t know about that hadoop command.  This is awesome. Thanks!

hadoop org.apache.hadoop.security.HadoopKerberosName trafodion-robertaclus...@trafkdc.com

Rob



From: Roberta Marton <roberta.mar...@esgyn.com>
Reply-To: "user@ambari.apache.org"
Date: Monday, March 21, 2016 at 2:08 PM
To: "user@ambari.apache.org"
Subject: Trying to create hbase tables after enabling Kerberos with Ambari

I am trying to install Kerberos on top of my Hortonworks installation.  I have 
tried this with both versions 2.2 and 2.3 and get similar results.
After I enable Kerberos, I create a Linux user called trafodion and grant this 
user all HBase permissions.
I connect as trafodion but get permission errors when I try to create a table.

Details:

[trafodion@myhost ~]$ whoami
trafodion

[trafodion@myhost ~]$ klist
Ticket cache: FILE:/tmp/krb5cc_503
Default principal: 
trafodion-robertaclus...@trafkdc.com

Valid starting     Expires            Service principal
03/21/16 16:39:33  03/22/16 16:39:33  krbtgt/trafkdc@trafkdc.com
renew until 03/21/16 16:39:33

hbase shell

hbase(main):002:0> whoami
trafodion-robertaclus...@trafkdc.com (auth:KERBEROS)
2016-03-21 17:06:22,925 WARN  [main] security.UserGroupInformation: No groups 
available for user trafodion-robertaCluster

hbase(main):003:0> user_permission
User        Table,Family,Qualifier:Permission
trafodion   hbase:acl,,: [Permission: actions=READ,WRITE,EXEC,CREATE,ADMIN]
ambari-qa   hbase:acl,,: [Permission: actions=READ,WRITE,EXEC,CREATE,ADMIN]
2 row(s) in 1.7630 seconds

hbase(main):004:0> create 't1', 'f1', 'f2'

ERROR: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient 
permissions for user 'trafodion-robertaCluster' (global, 

RE: Do we have API for change password

2016-03-22 Thread Satyanarayana Jampa
Got it working:
curl -w "%{http_code}" -o /dev/null --user admin:admin 

From: Satyanarayana Jampa [mailto:sja...@innominds.com]
Sent: 22 March 2016 09:55
To: user@ambari.apache.org
Subject: RE: Do we have API for change password

Can we get the HTTP response code in this?

From: Dmitry Sen [mailto:d...@hortonworks.com]
Sent: 21 March 2016 20:02
To: user@ambari.apache.org
Subject: Re: Do we have API for change password


It works for me:



curl -i -u admin:admin  'http://192.168.120.6:1081/api/v1/users/admin' -X PUT 
-H 'X-Requested-By: ambari' --data-binary 
'{"Users/password":"newpasswd","Users/old_password":"admin"}'
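[Editor's sketch: the same call assembled with Python's standard library, using the example host, port, and credentials from the curl command above. It only builds the request; sending it is left to urlopen:]

```python
import base64
import json
import urllib.request

def build_password_change(base_url, user, old_password, new_password,
                          auth_user, auth_password):
    """Build the same PUT /api/v1/users/<user> request as the curl call above."""
    body = json.dumps({
        "Users/password": new_password,
        "Users/old_password": old_password,
    }).encode("utf-8")
    req = urllib.request.Request(f"{base_url}/api/v1/users/{user}",
                                 data=body, method="PUT")
    # Ambari rejects state-changing requests without this header.
    req.add_header("X-Requested-By", "ambari")
    token = base64.b64encode(f"{auth_user}:{auth_password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req  # send with urllib.request.urlopen(req) and check the status

req = build_password_change("http://192.168.120.6:1081", "admin",
                            "admin", "newpasswd", "admin", "admin")
print(req.get_method(), req.full_url)
```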



BR,

Dmytro Sen




From: Satyanarayana Jampa
Sent: Monday, March 21, 2016 7:59 AM
To: user@ambari.apache.org
Subject: Do we have API for change password

Hi,
  Just wondering if we have an API to change the Ambari password, instead of
logging in and changing the password.
  Actually, I have a custom script that takes the username and password as
arguments, which I need to use for configuration before installing Ambari.

Thanks,
Satya.