[ https://issues.apache.org/jira/browse/SPARK-19143?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16656476#comment-16656476 ]

Steve Loughran commented on SPARK-19143:
----------------------------------------

I'm looking at this as I add delegation tokens (DTs) to S3A (HADOOP-14556), 
to see how downstream projects pick them up. The specific issue I'm chasing 
is "can a user submit a new set of tokens from their machine and have them 
propagate to a running cluster?", especially in the context of "not giving a 
long-lived cluster my keytab because IT won't give it to me".

Has anyone done anything recently here?
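
On the propagation side, whatever distribution mechanism gets built, I'd 
expect the receiving end to reduce to merging the refreshed tokens into the 
current UGI. A hypothetical sketch, not an existing Spark API; updateTokens 
and how the bytes reach each executor are assumptions:

{code:scala}
import java.io.{ByteArrayInputStream, DataInputStream}
import org.apache.hadoop.security.{Credentials, UserGroupInformation}

// Deserialize a token blob and merge it into the current user's
// credentials, so subsequent Hadoop calls pick up the new tokens.
// How the blob is delivered to each executor is exactly the API gap.
def updateTokens(tokenBytes: Array[Byte]): Unit = {
  val creds = new Credentials()
  creds.readTokenStorageStream(
    new DataInputStream(new ByteArrayInputStream(tokenBytes)))
  UserGroupInformation.getCurrentUser.addCredentials(creds)
}
{code}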

> API in Spark for distributing new delegation tokens (Improve delegation token 
> handling in secure clusters)
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-19143
>                 URL: https://issues.apache.org/jira/browse/SPARK-19143
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core, YARN
>    Affects Versions: 2.0.2, 2.1.0
>            Reporter: Ruslan Dautkhanov
>            Priority: Major
>
> Spin-off from SPARK-14743 and the comment chain in [recent comments|https://issues.apache.org/jira/browse/SPARK-5493?focusedCommentId=15802179&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-15802179] in SPARK-5493.
> Spark currently doesn't have a way of distributing new delegation tokens.
> Quoting [~vanzin] from SPARK-5493:
> {quote}
> IIRC Livy doesn't yet support delegation token renewal. Once it reaches the 
> TTL, the session is unusable.
> There might be ways to hack support for that without changes in Spark, but 
> I'd like to see a proper API in Spark for distributing new delegation tokens. 
> I mentioned that in SPARK-14743, but although that bug is closed, that 
> particular feature hasn't been implemented yet.
> {quote}
> Other thoughts?


