[ 
https://issues.apache.org/jira/browse/SPARK-3270?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Michal Malohlava updated SPARK-3270:
------------------------------------
    Description: 
Any application should be able to enrich the Spark infrastructure with 
services that are not available by default.

Hence, to support such application extensions (a.k.a. "extensions"/"plugins"), 
the Spark platform should provide:
  - an API to register an extension 
  - an API to register a "service" (meaning provided functionality)
  - well-defined points in the Spark infrastructure which can be 
enriched/hooked by an extension
  - a way of deploying an extension (for example, by simply putting it on the 
classpath and using the Java service-provider interface)
  - a way to access an extension from an application
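A registration API combined with classpath deployment naturally suggests the 
JDK's ServiceLoader mechanism. The sketch below is purely illustrative of that 
idea; SparkExtension and ExtensionRegistry are hypothetical names, not 
existing Spark APIs:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

/**
 * Sketch of ServiceLoader-based extension discovery and registration.
 * All names here (SparkExtension, ExtensionRegistry) are hypothetical.
 */
public class ExtensionRegistry {

    /** Hypothetical contract an extension jar would implement. */
    public interface SparkExtension {
        String name();
        void onStart();   // hook invoked at a well-defined lifecycle point
    }

    private final List<SparkExtension> extensions = new ArrayList<>();

    /**
     * Discover implementations declared in META-INF/services files on the
     * classpath, i.e. "deployment by simply putting the jar on the classpath".
     */
    public void discover() {
        for (SparkExtension ext : ServiceLoader.load(SparkExtension.class)) {
            extensions.add(ext);
        }
    }

    /** Programmatic registration path, e.g. for tests. */
    public void register(SparkExtension ext) {
        extensions.add(ext);
    }

    /** Invoke the start hook on every registered extension. */
    public void startAll() {
        for (SparkExtension ext : extensions) {
            ext.onStart();
        }
    }

    /** Access point for applications that want to query extensions. */
    public List<SparkExtension> all() {
        return extensions;
    }
}
```

An extension jar would then only need to ship a 
META-INF/services/<interface-name> descriptor listing its implementation 
class for discover() to pick it up.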

The overall proposal is available here: 
https://docs.google.com/document/d/1dHF9zi7GzFbYnbV2PwaOQ2eLPoTeiN9IogUe4PAOtrQ/edit?usp=sharing

Note: In this context, I do not mean reinventing OSGi (or another plugin 
platform), but it can serve as a good starting point.


  was:
At the beginning, let me clarify my motivation: I would like to extend the 
Spark platform with an embedded application (e.g., one monitoring network 
performance in the context of selected applications) which would be launched 
on particular nodes in the cluster when they start.
Nevertheless, I do not want to modify Spark code directly and hardcode my 
changes in; I would prefer to provide a jar which would be registered and 
launched by Spark itself. 

Hence, to support such 3rd-party applications (a.k.a. "extensions"/"plugins"), 
the Spark platform should provide at least:
  - an API to register an extension 
  - an API to register a "service" (meaning provided functionality)
  - well-defined points in the Spark infrastructure which can be 
enriched/hooked by an extension
     - in the master/worker lifecycle
     - in the application lifecycle
     - in the RDD lifecycle
     - monitoring/reporting
     - ...
  - a way of deploying an extension (for example, by simply putting it on the 
classpath and using the Java service-provider interface)
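The lifecycle hook points listed above could each be modeled as a small 
callback interface that extensions implement. A minimal sketch, assuming 
hypothetical names throughout (WorkerLifecycleHook, NetworkMonitorExtension 
are not real Spark APIs):

```java
import java.util.HashSet;
import java.util.Set;

/**
 * Sketch of an extension hooking a worker-lifecycle point, in the spirit of
 * the motivating network-monitoring example. All names are hypothetical.
 */
public class NetworkMonitorExtension {

    /** Hypothetical hook interface for the master/worker lifecycle. */
    public interface WorkerLifecycleHook {
        void onWorkerStart(String workerId);
        void onWorkerStop(String workerId);
    }

    /** A monitoring extension that tracks which workers are currently up. */
    public static class Monitor implements WorkerLifecycleHook {
        private final Set<String> activeWorkers = new HashSet<>();

        @Override
        public void onWorkerStart(String workerId) {
            activeWorkers.add(workerId);    // start network probes here
        }

        @Override
        public void onWorkerStop(String workerId) {
            activeWorkers.remove(workerId); // tear the probes down
        }

        public int activeWorkerCount() {
            return activeWorkers.size();
        }
    }
}
```

Spark would invoke the hooks at the corresponding lifecycle points, so the 
extension never needs to patch Spark code itself.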

In this context, I do not mean reinventing OSGi (or another plugin platform) 
but it can serve as a good starting point.




> Spark API for Application Extensions
> ------------------------------------
>
>                 Key: SPARK-3270
>                 URL: https://issues.apache.org/jira/browse/SPARK-3270
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Michal Malohlava



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
