[ https://issues.apache.org/jira/browse/SPARK-6703?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14492909#comment-14492909 ]

Ilya Ganelin commented on SPARK-6703:
-------------------------------------

Patrick - what's the timeline for the 1.4 release? I just want to have a
sense of it so I can schedule accordingly.

Thank you, 
Ilya Ganelin

> Provide a way to discover existing SparkContext's
> -------------------------------------------------
>
>                 Key: SPARK-6703
>                 URL: https://issues.apache.org/jira/browse/SPARK-6703
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 1.3.0
>            Reporter: Patrick Wendell
>            Assignee: Ilya Ganelin
>            Priority: Critical
>
> Right now it is difficult to write a Spark application in a way that can be 
> run independently and also be composed with other Spark applications in an 
> environment such as the JobServer, notebook servers, etc., where there is a 
> shared SparkContext.
> It would be nice to provide a rendezvous point so that applications can 
> learn whether a SparkContext already exists before creating one.
> The simplest/most surgical way I see to do this is to have an optional static 
> SparkContext singleton that can be retrieved as follows:
> {code}
> val sc = SparkContext.getOrCreate(conf = new SparkConf())
> {code}
> There could also be a setter so that an outer framework/server can set the 
> context for use by multiple downstream applications (see the sketch below).
> A more advanced version of this would have a named registry of contexts, 
> but since we only support a single SparkContext per JVM at this point 
> anyway, this seems sufficient and much simpler. Another advanced option 
> would be to allow plugging in some other notion of configuration you'd pass 
> when retrieving an existing context.
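> A minimal sketch of what such a rendezvous point could look like, assuming a 
> synchronized singleton holder (the object name SparkContextRegistry and the 
> setter setActiveContext are illustrative, not existing Spark internals):
> {code}
> import org.apache.spark.{SparkConf, SparkContext}
> 
> object SparkContextRegistry {
>   // Holds the shared context, if an outer framework has set one
>   // or an application has already created one.
>   private var activeContext: Option[SparkContext] = None
> 
>   // Setter for an outer framework/server (JobServer, notebook server)
>   // to register a shared context for downstream applications.
>   def setActiveContext(sc: SparkContext): Unit = synchronized {
>     activeContext = Some(sc)
>   }
> 
>   // Applications call this instead of `new SparkContext(conf)`:
>   // reuse the shared context if one exists, otherwise create one.
>   def getOrCreate(conf: SparkConf): SparkContext = synchronized {
>     activeContext.getOrElse {
>       val sc = new SparkContext(conf)
>       activeContext = Some(sc)
>       sc
>     }
>   }
> }
> {code}
> Keeping the holder synchronized matters because multiple downstream 
> applications in a shared JVM could otherwise race to create the context.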


