Github user mridulm commented on the issue:

    https://github.com/apache/spark/pull/17723
  
    @vanzin 
    > But it's just a small piece of the puzzle. For example, for Hadoop token 
providers, obtainCredentials has to be called with a very specific environment 
set up depending on when it's called. 
    
    On the contrary, this is handled if you read the proposal - 
ServiceTokenManager.acquireTokens does the necessary setup before 
acquiring tokens from the BaseServiceTokenProviders that match its service type.
    
    To elaborate:
    * configure - grabs the principal/keytab and configures itself based on the 
SparkConf provided.
    * acquireTokens - sets up the environment specific to the security model.
      * Executes BaseServiceTokenProvider.obtainCredentials.
      * In the case of hadoop security, obtainCredentials is invoked within the 
UGI setup; for other models it could be something else - that is an 
implementation detail which must not leak out.
      * Serializes the result to Array[Byte] and returns it.
    * applyTokens - at the executors, applies the obtained bytes.
      * Each ServiceTokenManager would deserialize a different object 
representing its tokens, and have different semantics for how those tokens are 
applied to the executor environment.
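    To make the lifecycle concrete, here is a rough sketch of what I have in 
mind. The trait and method names come from the proposal above, but every 
signature and type here is an assumption for illustration, not the actual API:

```scala
// Illustrative sketch only: trait/method names are from the proposal in
// this thread; all signatures and types are assumptions.

// One provider per service; the manager decides the environment in which
// obtainCredentials runs (e.g. inside a UGI doAs for hadoop-security).
trait BaseServiceTokenProvider {
  def serviceType: String
  def obtainCredentials(): Array[Byte]
}

trait ServiceTokenManager {
  def serviceType: String
  // Driver side: read principal/keytab/etc. from the configuration.
  def configure(conf: Map[String, String]): Unit
  // Driver side: set up the model-specific environment, run
  // obtainCredentials on matching providers, serialize the result.
  def acquireTokens(providers: Seq[BaseServiceTokenProvider]): Array[Byte]
  // Executor side: deserialize the bytes and apply the tokens.
  def applyTokens(tokens: Array[Byte]): Unit
}

// Toy manager showing the flow; a real implementation would perform
// security-model-specific setup before calling its providers.
class DemoTokenManager extends ServiceTokenManager {
  private var conf = Map.empty[String, String]
  val serviceType = "demo"
  def configure(c: Map[String, String]): Unit = { conf = c }
  def acquireTokens(providers: Seq[BaseServiceTokenProvider]): Array[Byte] =
    providers.filter(_.serviceType == serviceType)
             .flatMap(_.obtainCredentials()).toArray
  def applyTokens(tokens: Array[Byte]): Unit = ()  // no-op in this sketch
}
```

    The key point is that only the manager knows how to set up the environment 
for its providers and how the serialized bytes are applied on the executors; 
nothing in the generic plumbing needs to know about hadoop-security.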
    
    The whole point of generalizing is that existing functionality should 
continue to work without depending on the specifics of hadoop-security.
    Given @rxin's comment above, I am actually more convinced that taking a 
hadoop-security-only route is detrimental to future integration with other 
systems - I just did not have examples earlier to elaborate my point.


