Github user jerryshao commented on the issue:

    https://github.com/apache/spark/pull/14065
  
    @tgravescs and @vanzin , I've recently done some code refactoring on 
this patch. Here are the changes compared to the previous code:
    
    1. Renamed the interface `ServiceTokenProvider` to 
`ServiceCredentialProvider`, with the main method renamed to `obtainCredentials` 
as suggested in the comments.
    
        Since we are no longer limited to tokens, the provider now obtains 
credentials rather than tokens. The method `obtainCredentials` is defined as:
        
        ```scala
        def obtainCredentials(hadoopConf: Configuration, creds: Credentials): Option[Long]
        ```
       
        The return value `Option[Long]` is the time of the next renewal: 
`Some(time)` if the credential is renewable, `None` otherwise.
      
        Several redundant methods, such as the one for getting the token 
renewal interval, were also removed.
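
        To make the contract concrete, here is a self-contained sketch of what a 
provider implementing this interface could look like. The stub `Configuration` and 
`Credentials` classes stand in for Hadoop's, and `ExampleProvider`, its service name, 
and the 24-hour renewal interval are purely illustrative:

        ```scala
        // Stand-ins for org.apache.hadoop.conf.Configuration and
        // org.apache.hadoop.security.Credentials, to keep the sketch runnable.
        class Configuration
        class Credentials

        trait ServiceCredentialProvider {
          def serviceName: String
          // Returns Some(nextRenewalTime) if the obtained credential is
          // renewable, None otherwise.
          def obtainCredentials(hadoopConf: Configuration, creds: Credentials): Option[Long]
        }

        // Illustrative provider: pretends to fetch a token that must be
        // renewed every 24 hours, so it reports that time as the next renewal.
        class ExampleProvider extends ServiceCredentialProvider {
          override val serviceName: String = "example"
          override def obtainCredentials(hadoopConf: Configuration,
                                         creds: Credentials): Option[Long] = {
            val renewIntervalMs = 24L * 60 * 60 * 1000
            Some(System.currentTimeMillis() + renewIntervalMs)
          }
        }
        ```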
    
    2. Renamed `ConfigurableTokenManager` to `ConfigurableCredentialManager`, 
which manages all the credential providers.
    3. Changed credential provider loading to use Java's `ServiceLoader`, as 
suggested in the comments.
    4. Changed initialization from a singleton to ordinary instantiation.
    5. Changed the credential-checking mechanism in `AMDelegationRenewer` 
and `ExecutorDelegationTokenUpdate`. Since we now know the time of the next 
renewal, we use it to decide when to wake up and check for new credentials.
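
    The scheduling idea in point 5 can be sketched as follows: instead of polling 
on a fixed interval, the renewer sleeps until a fraction of the earliest 
next-renewal time reported by the providers. The 0.75 ratio and the helper name 
`nextCheckDelay` are illustrative, not the actual implementation:

    ```scala
    // Given the next-renewal times returned by all providers and the current
    // time, compute how long to sleep before checking for new credentials.
    // Waking up at 75% of the remaining interval leaves headroom to renew
    // before anything actually expires (the ratio is an arbitrary example).
    def nextCheckDelay(nextRenewalTimes: Seq[Long], now: Long): Long = {
      if (nextRenewalTimes.isEmpty) Long.MaxValue  // nothing renewable: no wake-up needed
      else {
        val earliest = nextRenewalTimes.min
        math.max(0L, ((earliest - now) * 0.75).toLong)
      }
    }
    ```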
    
    Please help review this; thanks a lot for your time, and I greatly 
appreciate your comments.
    


