Hi,


Spark's default behaviour is to request a brand-new token every 24 hours rather
than renewing existing delegation tokens, and that is the better approach for
long-running applications such as streaming ones.
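
For reference, this is the shape of submission we run; a minimal sketch where
the principal, keytab path, class and jar are placeholders for our real values:

  spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --principal appuser@EXAMPLE.COM \
    --keytab /etc/security/keytabs/appuser.keytab \
    --class com.example.StreamingApp \
    streaming-app.jar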



In our use case, using a keytab and principal works fine for
hdfs_delegation_token but NOT for "kms-dt".
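
For completeness, the KMS client is wired in through the standard Hadoop
key-provider setting; a sketch of what we pass (the KMS host and port are
placeholders, and this assumes the cluster's core-site.xml does not already set
it):

  --conf spark.hadoop.hadoop.security.key.provider.path=kms://http@kms-host:9600/kms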



Does anyone know why this is happening? Any suggestions for making it work with
KMS?



Thanks









Paolo Platter

CTO

E-mail: paolo.plat...@agilelab.it

Web Site: www.agilelab.it






________________________________
From: Marcelo Vanzin <van...@cloudera.com.INVALID>
Sent: Thursday, January 3, 2019 7:03:22 PM
To: alinazem...@gmail.com
Cc: user
Subject: Re: How to reissue a delegated token after max lifetime passes for a
spark streaming application on a Kerberized cluster

If you are using the principal / keytab params, Spark should create
tokens as needed. If it's not, something else is going wrong, and only
looking at full logs for the app would help.
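
To get more detail out of the logs, you can turn up the loggers around token
handling; for example (assuming the stock log4j.properties that ships with
Spark):

  log4j.logger.org.apache.spark.deploy.security=DEBUG
  log4j.logger.org.apache.hadoop.security=DEBUG
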
On Wed, Jan 2, 2019 at 5:09 PM Ali Nazemian <alinazem...@gmail.com> wrote:
>
> Hi,
>
> We are using a headless keytab to run our long-running Spark streaming
> application. The token is renewed automatically every day until it hits the
> max lifetime limit. The problem is that the token expires after the max
> lifetime (7 days) and we need to restart the job. Is there any way we can
> re-issue the token and pass it to a job that is already running? It doesn't
> feel right at all to restart the job every 7 days only because of the token
> issue.
>
> P.S.: We use "--keytab /path/to/the/headless-keytab", "--principal
> principalNameAsPerTheKeytab" and "--conf
> spark.hadoop.fs.hdfs.impl.disable.cache=true" as the arguments for the
> spark-submit command.
>
> Thanks,
> Ali



--
Marcelo

