[ 
https://issues.apache.org/jira/browse/SPARK-41073?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

zhengchenyu updated SPARK-41073:
--------------------------------
    Description: 
In our cluster, ZooKeeper nearly crashed. I found that the znodes under 
/zkdtsm/ZKDTSMRoot/ZKDTSMTokensRoot increased quickly. 
After some research, I found that some SQL queries running on Spark ThriftServer 
obtain huge amounts of DelegationTokens.
The reason is that in these Spark SQL queries, every Hive partition acquires a 
different delegation token. 
This is because HadoopRDDs in the ThriftServer cannot share credentials from 
HadoopDelegationTokenManager. 
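
For context, a minimal Scala sketch of the idea, assuming only the standard Hadoop 
UserGroupInformation/Credentials/FileSystem APIs: before a HadoopRDD asks a FileSystem 
for a new delegation token, it could first check whether the current user's credentials 
already hold a token for that service and reuse it. The object and method names below 
are illustrative, not the actual Spark code path.

{code:scala}
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.io.Text
import org.apache.hadoop.security.{Credentials, UserGroupInformation}

// Hypothetical helper, not real Spark code: fetch a delegation token for a path
// only when the current user's Credentials do not already contain one for the
// target filesystem's service. Otherwise every HadoopRDD (one per Hive partition
// path) would request its own token, which is the behavior described above.
object SharedTokenSketch {
  def obtainTokenIfMissing(path: Path, conf: Configuration, renewer: String): Unit = {
    val fs = path.getFileSystem(conf)
    val ugi = UserGroupInformation.getCurrentUser
    val creds: Credentials = ugi.getCredentials
    val serviceName = fs.getCanonicalServiceName
    // Ask the NameNode for a token only if none exists yet for this service.
    if (serviceName != null && creds.getToken(new Text(serviceName)) == null) {
      fs.addDelegationTokens(renewer, creds)
      ugi.addCredentials(creds)
    }
  }
}
{code}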

  was:
In our cluster, ZooKeeper nearly crashed. I found that the znodes under 
/zkdtsm/ZKDTSMRoot/ZKDTSMTokensRoot increased quickly. 
After some research, I found that some SQL queries running on Spark ThriftServer 
obtain huge amounts of DelegationTokens.
The reason is that in these Spark SQL queries, every Hive partition acquires a 
different delegation token. 
HadoopRDDs can't share delegation tokens. The ThriftServer should share the 
delegation tokens from HadoopDelegationTokenManager.


> Spark ThriftServer generates huge amounts of DelegationTokens
> --------------------------------------------------------------
>
>                 Key: SPARK-41073
>                 URL: https://issues.apache.org/jira/browse/SPARK-41073
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.1
>            Reporter: zhengchenyu
>            Priority: Major
>
> In our cluster, ZooKeeper nearly crashed. I found that the znodes under 
> /zkdtsm/ZKDTSMRoot/ZKDTSMTokensRoot increased quickly. 
> After some research, I found that some SQL queries running on Spark 
> ThriftServer obtain huge amounts of DelegationTokens.
> The reason is that in these Spark SQL queries, every Hive partition acquires 
> a different delegation token. 
> This is because HadoopRDDs in the ThriftServer cannot share credentials from 
> HadoopDelegationTokenManager. 



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
