[ https://issues.apache.org/jira/browse/SPARK-32119?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Kousuke Saruta updated SPARK-32119:
-----------------------------------
    Description: 
ExecutorPlugin doesn't work with Standalone Cluster (and possibly with cluster managers other than YARN) when a jar containing the plugins, and files used by the plugins, are added via the --jars and --files options of spark-submit.

This is because jars and files added by --jars and --files are not loaded on Executor initialization.

I confirmed it works with YARN because jars/files there are distributed via the distributed cache.

  was:
ExecutorPlugin can't work with Standalone Cluster (maybe with other cluster manager too except YARN. ) when a jar which contains plugins and files used by the plugins are added by --jars and --files option with spark-submit. This is because jars and files added by --jars and --files are not loaded on Executor initialization. I confirmed it works **with YARN because jars/files are distributed as distributed cache.


> ExecutorPlugin doesn't work with Standalone Cluster
> ---------------------------------------------------
>
>                 Key: SPARK-32119
>                 URL: https://issues.apache.org/jira/browse/SPARK-32119
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.1.0
>            Reporter: Kousuke Saruta
>            Assignee: Kousuke Saruta
>            Priority: Major
>
> ExecutorPlugin doesn't work with Standalone Cluster (and possibly with
> cluster managers other than YARN) when a jar containing the plugins, and
> files used by the plugins, are added via the --jars and --files options of
> spark-submit.
> This is because jars and files added by --jars and --files are not loaded on
> Executor initialization.
> I confirmed it works with YARN because jars/files there are distributed via
> the distributed cache.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
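
A minimal reproduction sketch of the submission the issue describes. The master URL, jar/file names, and the plugin class com.example.MyPlugin (an implementation of org.apache.spark.api.plugin.SparkPlugin, registered via the Spark 3 spark.plugins config) are all placeholders, not taken from the issue:

```shell
# Hypothetical reproduction (names and master URL are placeholders).
# my-plugins.jar is assumed to contain com.example.MyPlugin, whose
# executor-side plugin reads plugin-conf.properties at init time.
spark-submit \
  --master spark://standalone-master:7077 \
  --conf spark.plugins=com.example.MyPlugin \
  --jars my-plugins.jar \
  --files plugin-conf.properties \
  my-app.jar
# Per the issue: on Standalone, the plugin cannot start because artifacts
# added via --jars/--files are not yet available when the Executor
# initializes; the same submission works on YARN, where they are shipped
# through the distributed cache before the Executor starts.
```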