[ https://issues.apache.org/jira/browse/SPARK-2593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Helena Edelson updated SPARK-2593:
----------------------------------

    Description: 
As a developer I want to pass an existing ActorSystem into StreamingContext at 
load time so that I do not have two actor systems running on a node in an Akka 
application.

This would mean putting Spark's actor system on its own named dispatchers, as 
well as exposing the currently private creation of its own actor system.
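
For illustration only, the requested hook might look like the sketch below; 
the ActorSystem-accepting overload is hypothetical and does not exist in 
Spark's current API:

import akka.actor.ActorSystem
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val system = ActorSystem("MyAkkaApp")  // the application's existing actor system

// Hypothetical overload: hand Spark the existing system instead of letting
// it create a second, private ActorSystem on the node.
// val ssc = new StreamingContext(new SparkConf(), Seconds(1), system)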
 
I would like to create an Akka Extension that wraps Spark/Spark Streaming and 
Cassandra, so that programmatic creation for a user would simply be:

val extension = SparkCassandra(system)
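
A minimal sketch of such an extension using Akka's standard ExtensionId 
pattern; the SparkCassandraExt internals and the SparkConf settings are 
hypothetical placeholders, not a proposed implementation:

import akka.actor.{ExtendedActorSystem, Extension, ExtensionId, ExtensionIdProvider}
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical extension body: creates (or, per this ticket, ideally reuses)
// Spark resources scoped to the lifecycle of the enclosing ActorSystem.
class SparkCassandraExt(system: ExtendedActorSystem) extends Extension {
  val spark: SparkContext =
    new SparkContext(new SparkConf().setMaster("local[*]").setAppName(system.name))
  // a streaming context, Cassandra session, etc. would be exposed similarly
}

object SparkCassandra extends ExtensionId[SparkCassandraExt] with ExtensionIdProvider {
  override def lookup() = SparkCassandra
  override def createExtension(system: ExtendedActorSystem) = new SparkCassandraExt(system)
}

Akka then caches one instance per ActorSystem, so SparkCassandra(system) 
always returns the same extension for a given system.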
 

  was:
As a developer I want to pass an existing ActorSystem into StreamingContext at 
load time so that I do not have two actor systems running on a node.

This would mean putting Spark's actor system on its own named dispatchers, as 
well as exposing the currently private creation of its own actor system.

If it makes sense...

I would like to create an Akka Extension that wraps Spark/Spark Streaming and 
Cassandra, so that creation for a user would simply be:

val extension = SparkCassandra(system)

and using it is as easy as:

import extension._
spark.     // do work, or
streaming. // do work
 
All config would come from reference.conf plus user overrides of it. The conf 
file would pick up settings from the deployed environment first, then fall 
back to -D system properties, with a final fallback to the configured 
reference.conf defaults.
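
With Typesafe Config, that resolution order could look something like the 
sketch below; the /etc file path is a placeholder:

import java.io.File
import com.typesafe.config.{Config, ConfigFactory}

// Resolution order described above: deployed-environment settings first,
// then -D system properties, then the bundled reference.conf defaults.
val config: Config =
  ConfigFactory.parseFile(new File("/etc/myapp/deployed.conf"))  // deployed environment (placeholder path)
    .withFallback(ConfigFactory.systemProperties())              // -D overrides
    .withFallback(ConfigFactory.defaultReference())              // reference.conf defaults
    .resolve()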




> Add ability to pass an existing Akka ActorSystem into Spark
> -----------------------------------------------------------
>
>                 Key: SPARK-2593
>                 URL: https://issues.apache.org/jira/browse/SPARK-2593
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Helena Edelson
>
> As a developer I want to pass an existing ActorSystem into StreamingContext 
> at load time so that I do not have two actor systems running on a node in 
> an Akka application.
> This would mean putting Spark's actor system on its own named dispatchers, 
> as well as exposing the currently private creation of its own actor system.
>  
> I would like to create an Akka Extension that wraps Spark/Spark Streaming 
> and Cassandra, so that programmatic creation for a user would simply be:
> val extension = SparkCassandra(system)
>  



--
This message was sent by Atlassian JIRA
(v6.2#6252)
