[ https://issues.apache.org/jira/browse/SPARK-33182?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-33182.
----------------------------------
    Resolution: Incomplete

> Using Delegation Tokens to access HBase through Spark (Java)
> ------------------------------------------------------------
>
>                 Key: SPARK-33182
>                 URL: https://issues.apache.org/jira/browse/SPARK-33182
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Core
>    Affects Versions: 2.3.0
>         Environment: Spark 2.3, RHEL 7.5, HDP 3
>            Reporter: Rishi S Balajis
>            Priority: Major
>              Labels: hbase, kerberos
>
> I have a requirement to access a kerberized HBase cluster through delegation 
> tokens instead of a keytab. I have generated the token using the TokenUtil API 
> and loaded it back into the UserGroupInformation; however, 
> hasKerberosCredentials() returns false. I am looking for the right way to use 
> a saved delegation token to access HBase. The code I currently have is shown 
> below. The error I see is:
> {code:java}
> client.RpcRetryingCallerImpl: Call exception, tries=6, retries=36, started=4850 ms ago, cancelled=false, msg=Connection closed, details=row 'employee,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=myhost.com,16020,1602758016346, seqNum=-1
> {code}
>  
> {code:java}
> import java.security.PrivilegedExceptionAction;
> 
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.hbase.HBaseConfiguration;
> import org.apache.hadoop.hbase.spark.JavaHBaseContext;
> import org.apache.hadoop.io.Text;
> import org.apache.hadoop.security.Credentials;
> import org.apache.hadoop.security.UserGroupInformation;
> import org.apache.spark.SparkConf;
> import org.apache.spark.api.java.JavaSparkContext;
> import org.apache.spark.sql.Dataset;
> import org.apache.spark.sql.Row;
> import org.apache.spark.sql.SQLContext;
> 
> // HBase/Hadoop client configuration; the original snippet reuses an existing 'conf'.
> Configuration conf = HBaseConfiguration.create();
> 
> // Read the saved delegation token(s) and attach them to the login user.
> UserGroupInformation ugi = UserGroupInformation.getLoginUser();
> Credentials creds = Credentials.readTokenStorageFile(
>     new Path("file:///zen-volume-home/tokenFile"), conf);
> 
> System.out.println("TOKEN ********* " + creds.getToken(new Text("hbase")));
> ugi.addToken(creds.getToken(new Text("hbase")));
> ugi.addCredentials(creds);
> 
> /* I do see the token getting printed. However, I am looking for information on
>    how to use this ugi, which now has the token added to it, to access data in
>    HBase. I have tried the following: */
> SQLContext sqlC = ugi.doAs(new PrivilegedExceptionAction<SQLContext>() {
>   public SQLContext run() throws Exception {
>     SparkConf sparkconf = new SparkConf().setAppName("Hbase With Spark");
>     JavaSparkContext jsc = new JavaSparkContext(sparkconf);
>     JavaHBaseContext hbaseContext = new JavaHBaseContext(jsc, conf);
>     SQLContext sqlContext = new SQLContext(jsc);
> 
>     // Map the HBase column 'all:name' to a SQL column named 'name'.
>     String sqlMapping = "name STRING all:name";
>     Dataset<Row> dfEmp = sqlContext.read()
>         .format("org.apache.hadoop.hbase.spark")
>         .option("hbase.columns.mapping", sqlMapping)
>         .option("hbase.table", "employee")
>         .load();
> 
>     dfEmp.registerTempTable("empdata");
>     dfEmp.show();
>     return sqlContext;
>   }
> });
> {code}
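>  
> For reference, the token file read above is generated roughly along these lines (a minimal sketch only; it assumes the Connection-based TokenUtil.obtainToken overload and the Hadoop Credentials API, and that the generating process still has a Kerberos login):
> {code:java}
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.hbase.HBaseConfiguration;
> import org.apache.hadoop.hbase.client.Connection;
> import org.apache.hadoop.hbase.client.ConnectionFactory;
> import org.apache.hadoop.hbase.security.token.TokenUtil;
> import org.apache.hadoop.io.Text;
> import org.apache.hadoop.security.Credentials;
> import org.apache.hadoop.security.token.Token;
> 
> // HBase client configuration for the kerberized cluster.
> Configuration conf = HBaseConfiguration.create();
> 
> try (Connection connection = ConnectionFactory.createConnection(conf)) {
>   // Ask HBase for a delegation token on behalf of the logged-in principal.
>   Token<?> token = TokenUtil.obtainToken(connection);
> 
>   // Store the token under the "hbase" alias (matching the lookup in the code
>   // above) and persist it for later reuse by a process without Kerberos
>   // credentials of its own.
>   Credentials creds = new Credentials();
>   creds.addToken(new Text("hbase"), token);
>   creds.writeTokenStorageFile(new Path("file:///zen-volume-home/tokenFile"), conf);
> }
> {code}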
>  
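> To verify what the credentials file actually contains, the stored tokens can be listed with their kind and service (a small diagnostic sketch; an HBase delegation token is expected to show up with kind HBASE_AUTH_TOKEN, since the HBase client selects tokens by kind and service rather than by the "hbase" alias):
> {code:java}
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.hbase.HBaseConfiguration;
> import org.apache.hadoop.security.Credentials;
> import org.apache.hadoop.security.token.Token;
> 
> Configuration conf = HBaseConfiguration.create();
> Credentials creds = Credentials.readTokenStorageFile(
>     new Path("file:///zen-volume-home/tokenFile"), conf);
> 
> // Print every token in the file with its kind and service.
> for (Token<?> t : creds.getAllTokens()) {
>   System.out.println("kind=" + t.getKind() + " service=" + t.getService());
> }
> {code}
>  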



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
