Github user NiharS commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22192#discussion_r216210046
  
    --- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
    @@ -136,6 +136,26 @@ private[spark] class Executor(
   // for fetching remote cached RDD blocks, so need to make sure it uses the right classloader too.
       env.serializerManager.setDefaultClassLoader(replClassLoader)
     
    +  private val executorPlugins: Seq[ExecutorPlugin] = {
    +    val pluginNames = conf.get(EXECUTOR_PLUGINS)
    +    if (pluginNames.nonEmpty) {
    +      logDebug(s"Initializing the following plugins: ${pluginNames.mkString(", ")}")
    +
    +      // Plugins need to load using a class loader that includes the executor's user classpath
    +      val pluginList: Seq[ExecutorPlugin] =
    +        Utils.withContextClassLoader(replClassLoader) {
    +          val plugins = Utils.loadExtensions(classOf[ExecutorPlugin], pluginNames, conf)
    +          plugins.foreach(_.init())
    --- End diff --
    
    I think that should be the right behavior: if a plugin is faulty, it's probably best for the user to address that issue first. Since plugins are opt-in, a user providing one would expect it to function once it's included, and would likely want to know immediately if it fails rather than run their entire job only to discover that it didn't do what it was supposed to do.
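    To illustrate the fail-fast behavior being argued for, here is a minimal standalone sketch (the trait and object names are hypothetical, not the actual Spark code): by not wrapping `init()` in a try/catch, a faulty plugin surfaces its error at executor startup instead of being silently skipped.

    ```scala
    // Hypothetical sketch of fail-fast plugin initialization.
    trait SketchPlugin { def init(): Unit }

    object PluginInitSketch {
      // Deliberately no try/catch around init(): an exception from a
      // faulty plugin propagates and fails startup immediately.
      def initPlugins(plugins: Seq[SketchPlugin]): Seq[SketchPlugin] = {
        plugins.foreach(_.init())
        plugins
      }
    }
    ```

    With this approach, a user who opted in to a broken plugin sees the failure up front, rather than after the whole job has run without the plugin's effects.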


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org