Hi,

You can configure multiple realms inside /etc/krb5.conf on Linux hosts; it will 
also require the relevant DNS configuration and network access to the KDCs in 
order to work. 
See: https://stackoverflow.com/questions/26382936/multiple-realms
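For illustration, a minimal /etc/krb5.conf declaring two realms could look like 
the sketch below (realm names, KDC hostnames, and domain mappings are 
placeholders, not values from your environment):

```ini
[libdefaults]
    default_realm = REALM.A

[realms]
    # Each realm points at its own KDC; both are reachable from one host.
    REALM.A = {
        kdc = kdc-a.example.com
        admin_server = kdc-a.example.com
    }
    REALM.B = {
        kdc = kdc-b.example.com
        admin_server = kdc-b.example.com
    }

[domain_realm]
    # Map each cluster's DNS domain to its realm.
    .cluster-a.example.com = REALM.A
    .cluster-b.example.com = REALM.B
```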
Proxy-users won’t help you here: proxy-users rely on the fact that your 
service A from realm A can authenticate against service B from realm B, and if 
service A’s ticket is not trusted by service B in realm B, authentication will 
fail (which is why cross-realm trust works). Besides, proxy-users only request 
a token on behalf of user ‘toto’, and you already have user ‘toto’’s keytab, so 
it’s irrelevant.
If you need something slightly more dynamic (in terms of keytab management), 
you can take a look at S4U2proxy and S4U2self, but those require admin access 
to the KDC in order to be configured, plus a compatible Kerberos implementation 
(FreeIPA, MIT Kerberos, or Active Directory).
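To sketch how a single JVM can then hold credentials for both realms once they 
are declared in krb5.conf: register one JAAS login entry per cluster, each 
backed by its own principal and keytab. The entry names, principals, and keytab 
paths below are made up for illustration; Hadoop's UserGroupInformation uses 
the same Krb5LoginModule underneath.

```java
import javax.security.auth.login.AppConfigurationEntry;
import javax.security.auth.login.AppConfigurationEntry.LoginModuleControlFlag;
import javax.security.auth.login.Configuration;
import java.util.HashMap;
import java.util.Map;

/**
 * One JAAS Configuration with a separate login entry per cluster, each
 * pointing at its own principal and keytab. All names and paths here are
 * illustrative placeholders.
 */
public class MultiRealmJaas extends Configuration {
    private final Map<String, AppConfigurationEntry[]> entries = new HashMap<>();

    public MultiRealmJaas() {
        entries.put("clusterA",
                entryFor("toto@REALM.A", "/etc/security/keytabs/toto-a.keytab"));
        entries.put("clusterB",
                entryFor("toto@REALM.B", "/etc/security/keytabs/toto-b.keytab"));
    }

    private static AppConfigurationEntry[] entryFor(String principal, String keytab) {
        Map<String, String> opts = new HashMap<>();
        // Standard Krb5LoginModule options for a non-interactive keytab login.
        opts.put("principal", principal);
        opts.put("keyTab", keytab);
        opts.put("useKeyTab", "true");
        opts.put("storeKey", "true");
        opts.put("doNotPrompt", "true");
        return new AppConfigurationEntry[] {
            new AppConfigurationEntry(
                "com.sun.security.auth.module.Krb5LoginModule",
                LoginModuleControlFlag.REQUIRED,
                opts)
        };
    }

    @Override
    public AppConfigurationEntry[] getAppConfigurationEntry(String name) {
        return entries.get(name);
    }
}
```

Client code for each cluster would then run a LoginContext("clusterA") or 
LoginContext("clusterB") against this Configuration and perform its calls 
under the resulting Subject, keeping the two sets of credentials independent 
inside the one process.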

Regards

From: tobe <tobeg3oo...@gmail.com>
Sent: Tuesday, December 24, 2019 08:15
To: Vinod Kumar Vavilapalli <vino...@apache.org>
Cc: user.hadoop <user@hadoop.apache.org>
Subject: Re: How can we access multiple Kerberos-enabled Hadoop with different 
users in single JVM process

Thanks @Vinod, and proxy-users was considered.

But what we want to support is accessing multiple secured Hadoop clusters. To 
initialize Kerberos credentials, we need to configure /etc/krb5.conf. If we 
want to access two different Kerberos services (with different KDCs), we cannot 
run one JVM process with two /etc/krb5.conf files. That is why cross-realm can 
work: we only need to log in with one KDC. Since we can obtain users' keytab 
files, proxying is not the critical problem for us.

Please correct me if proxy-users can proxy different users across multiple 
secured Hadoop clusters.


Regards

On Tue, Dec 24, 2019 at 1:14 PM Vinod Kumar Vavilapalli 
<vino...@apache.org> wrote:
You are looking for the proxy-users pattern. See here: 
https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/Superusers.html
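For reference, the proxy-user pattern is enabled on the cluster side in 
core-site.xml; a minimal sketch, assuming a hypothetical superuser named 
'webservice' (the user name, host, and group values are placeholders):

```xml
<!-- core-site.xml: allow the 'webservice' superuser to impersonate other
     users. 'webservice', the host, and the group below are illustrative. -->
<property>
  <name>hadoop.proxyuser.webservice.hosts</name>
  <value>web01.example.com</value>
</property>
<property>
  <name>hadoop.proxyuser.webservice.groups</name>
  <value>users</value>
</property>
```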

Thanks
+Vinod


On Dec 24, 2019, at 9:49 AM, tobe 
<tobeg3oo...@gmail.com> wrote:

Currently Hadoop relies on Kerberos to do authentication and authorization. For 
a single user, we can initialize clients with keytab files from the command 
line or a Java program.

But sometimes we need to access Hadoop as multiple users. For example, we build 
a web service to view users' HDFS files. We are authorized to obtain the user 
name and use that user's keytab to log in before requesting HDFS. However, this 
doesn't work for multiple Hadoop clusters and multiple KDCs.

Currently the only way to do that is to enable cross-realm trust between these 
KDCs. But in some scenarios we cannot change the configuration of the KDCs, and 
we want a single process to switch the Kerberos user on the fly without much 
overhead.

Here is the related discussion on StackOverflow:
·  https://stackoverflow.com/questions/15126295/using-java-programmatically-log-in-multiple-kerberos-realms-with-different-keyta
·  https://stackoverflow.com/questions/57008499/data-transfer-between-two-kerberos-secured-cluster
·  https://stackoverflow.com/questions/22047145/hadoop-distcp-between-two-securedkerberos-clusters
·  https://stackoverflow.com/questions/39648106/access-two-secured-kerberos-hadoop-hbase-clusters-from-the-same-process
·  https://stackoverflow.com/questions/1437281/reload-kerberos-config-in-java-without-restarting-jvm

Regards
