Github user jerryshao commented on the issue:

    https://github.com/apache/spark/pull/17388
  
    @vanzin @tgravescs @mridulm do you think it is necessary to add the 
additional jars and the main jar to the classloader in yarn cluster mode?
    
    In my case I run Spark with HBase in a secure cluster, so I need to specify 
the hbase jars with `--jars` to make `HBaseCredentialProvider` work. 
Unfortunately, in yarn cluster mode these jars are not added to the 
classloader, so fetching the HBase token fails with a class-not-found error.
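    
    For reference, the submission looks roughly like this; the jar paths and 
class name are placeholders, and the exact set of hbase jars depends on the 
HBase version:
    
    ```
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --jars /path/to/hbase-client.jar,/path/to/hbase-common.jar \
      --class com.example.MyHBaseApp \
      my-app.jar
    ```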
    
    This also applies to a customized credential provider: if we write one and 
package it into the main jar, it will fail to load via ServiceLoader, because 
the main jar is not present in the client's classloader.
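    
    As a concrete illustration, such a customized provider implementing the 
`ServiceCredentialProvider` trait from the yarn module might look like the 
sketch below; the package, class, and service names are made up:
    
    ```scala
    package com.example.security

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.security.{Credentials, UserGroupInformation}
    import org.apache.spark.SparkConf
    import org.apache.spark.deploy.yarn.security.ServiceCredentialProvider

    class MyCredentialProvider extends ServiceCredentialProvider {

      // Name used in spark.yarn.security.credentials.<serviceName>.enabled
      override def serviceName: String = "myservice"

      // Only try to fetch tokens when Kerberos security is enabled.
      override def credentialsRequired(hadoopConf: Configuration): Boolean =
        UserGroupInformation.isSecurityEnabled

      // Obtain a delegation token for the external service, add it to
      // `creds`, and return the next renewal time (None if not renewable).
      override def obtainCredentials(
          hadoopConf: Configuration,
          sparkConf: SparkConf,
          creds: Credentials): Option[Long] = {
        // e.g. creds.addToken(alias, token) with a token fetched from the service
        None
      }
    }
    ```
    
    ServiceLoader discovers the implementation through a 
`META-INF/services/org.apache.spark.deploy.yarn.security.ServiceCredentialProvider` 
file in the jar listing the class name, which is exactly why the jar has to be 
visible on the client's classloader.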
    
    Though this could be worked around by expanding the launch classpath (e.g. 
via SPARK_CLASSPATH), I think a better solution is to add these jars to the 
child's classpath.
    
    What do you think? Is there any concern about putting these jars on the 
child's classpath in yarn cluster mode? Thanks a lot.


