[ https://issues.apache.org/jira/browse/SPARK-6703?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14495777#comment-14495777 ]

Evan Chan commented on SPARK-6703:
----------------------------------

I should note:

Having the jobserver support generic apps that don't implement an interface is 
an interesting idea (well, more like an implicit Trait { def main(args: 
Array[String]) }, I suppose). The only way I can think of to have them share a 
context would be for the jobserver to load the job jars and invoke their main 
methods. So what you describe might be useful.
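
Very roughly, I'm picturing something like the sketch below (all names here 
are made up, not an actual jobserver API):

{code}
import java.net.{URL, URLClassLoader}

// Load a job jar in its own classloader and invoke its main method; the
// app would then pick up the shared context through a rendezvous point
// like the proposed SparkContext.getOrCreate.
def runMain(jarUrl: URL, mainClass: String, args: Array[String]): Unit = {
  val loader = new URLClassLoader(Array(jarUrl), getClass.getClassLoader)
  val clazz  = loader.loadClass(mainClass)
  val main   = clazz.getMethod("main", classOf[Array[String]])
  main.invoke(null, args.asInstanceOf[AnyRef]) // static method, null receiver
}
{code}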

We also do have users working with multiple contexts in the same JVM. However, 
we are working on support for one JVM per context.

> Provide a way to discover existing SparkContext's
> -------------------------------------------------
>
>                 Key: SPARK-6703
>                 URL: https://issues.apache.org/jira/browse/SPARK-6703
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 1.3.0
>            Reporter: Patrick Wendell
>            Assignee: Ilya Ganelin
>            Priority: Critical
>
> Right now it is difficult to write a Spark application in a way that can be 
> run independently and also be composed with other Spark applications in an 
> environment such as the JobServer, notebook servers, etc., where there is a 
> shared SparkContext.
> It would be nice to provide a rendezvous point so that applications can 
> learn whether a SparkContext already exists before creating one.
> The simplest/most surgical way I see to do this is to have an optional static 
> SparkContext singleton that can be retrieved as follows:
> {code}
> val sc = SparkContext.getOrCreate(conf = new SparkConf())
> {code}
> And you could also have a setter where some outer framework/server can set it 
> for use by multiple downstream applications.
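> For example, usage might look like this (the setter name below is 
> hypothetical, not a settled API):
> {code}
> // outer framework/server creates the context once and registers it
> val shared = new SparkContext(new SparkConf())
> SparkContext.setActiveContext(shared)  // hypothetical setter
> // downstream applications later discover it instead of creating their own
> val sc = SparkContext.getOrCreate(conf = new SparkConf())
> {code}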
> A more advanced version of this would have some named registry or something, 
> but since we only support a single SparkContext in one JVM at this point 
> anyway, this seems sufficient and much simpler. Another advanced option 
> would be to allow plugging in some other notion of configuration you'd pass 
> when retrieving an existing context.


