[ https://issues.apache.org/jira/browse/SPARK-3270?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14143732#comment-14143732 ]
Michal Malohlava commented on SPARK-3270:
-----------------------------------------

Hi Patrick, you are right: in the case of an independent component, we can initialize it lazily with a task. Nevertheless, if all components inside all executors need to share common knowledge, then lazy initialization is a little cumbersome. In this JIRA we do not want to propose any heavyweight generic discovery system, just a lightweight way of running code inside the Spark infrastructure without modifying Spark core code (I would compare it to Linux kernel drivers).

> Spark API for Application Extensions
> ------------------------------------
>
>                 Key: SPARK-3270
>                 URL: https://issues.apache.org/jira/browse/SPARK-3270
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>            Reporter: Michal Malohlava
>
> Any application should be able to enrich the Spark infrastructure with
> services that are not available by default.
> Hence, to support such application extensions (aka "extensions"/"plugins"),
> the Spark platform should provide:
> - an API to register an extension
> - an API to register a "service" (meaning provided functionality)
> - well-defined points in the Spark infrastructure that can be
> enriched/hooked by an extension
> - a way of deploying an extension (for example, simply putting the
> extension on the classpath and using the Java service interface)
> - a way to access an extension from the application
> The overall proposal is available here:
> https://docs.google.com/document/d/1dHF9zi7GzFbYnbV2PwaOQ2eLPoTeiN9IogUe4PAOtrQ/edit?usp=sharing
> Note: In this context, I do not mean reinventing OSGi (or another plugin
> platform), but it can serve as a good starting point.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
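The "initialize lazily with a task" approach that the comment refers to is commonly done with a JVM-level singleton: the first task that runs on an executor constructs the shared component, and later tasks on the same executor reuse it. A minimal sketch in Java, assuming a hypothetical `ExpensiveService` standing in for whatever component an application would set up per executor (nothing here is a real Spark API):

```java
public class ExecutorSingleton {
    // Placeholder for a component an application would set up once per
    // executor JVM (hypothetical; not part of Spark).
    static class ExpensiveService {
        static int initCount = 0;          // for demonstration only
        ExpensiveService() { initCount++; }
        String ping() { return "ok"; }
    }

    // Initialized at most once per JVM (i.e., per executor), the first time
    // a task touches it. Double-checked locking with a volatile field keeps
    // concurrent tasks from constructing the service twice.
    private static volatile ExpensiveService instance;

    public static ExpensiveService get() {
        if (instance == null) {
            synchronized (ExecutorSingleton.class) {
                if (instance == null) {
                    instance = new ExpensiveService();
                }
            }
        }
        return instance;
    }
}
```

Inside a Spark job, a task would simply call `ExecutorSingleton.get()` (e.g., from `mapPartitions`). This works well for independent components, but, as the comment notes, it gives no hook for components that must coordinate across all executors before any task runs, which is the gap the proposed extension API targets.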
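The deployment mechanism named in the issue (putting the extension on the classpath and using the Java service interface) can be sketched with the standard `java.util.ServiceLoader`. The `SparkExtension` interface below is hypothetical: SPARK-3270 proposes something like it, but no such interface exists in Spark core.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

public class ExtensionLoader {
    // Hypothetical extension contract an application would implement.
    public interface SparkExtension {
        String name();
    }

    // ServiceLoader scans the classpath for provider-configuration files
    // under META-INF/services/ naming implementations of SparkExtension,
    // and instantiates each one it finds.
    public static List<String> discoverExtensionNames() {
        List<String> names = new ArrayList<>();
        for (SparkExtension ext : ServiceLoader.load(SparkExtension.class)) {
            names.add(ext.name());
        }
        return names;
    }

    public static void main(String[] args) {
        System.out.println("Discovered extensions: " + discoverExtensionNames());
    }
}
```

An extension jar would ship its implementation class plus a one-line provider-configuration file listing it; dropping the jar on the executor classpath is then the whole deployment step, which is the lightweight model the issue description asks for.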