[jira] [Updated] (SPARK-2593) Add ability to pass an existing Akka ActorSystem into Spark
[ https://issues.apache.org/jira/browse/SPARK-2593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Helena Edelson updated SPARK-2593:
----------------------------------
    Description: 
As a developer I want to pass an existing ActorSystem into StreamingContext at load time so that I do not have two actor systems running on a node in an Akka application.

This would mean putting Spark's actor system on its own named dispatchers, as well as exposing the currently private creation of its own actor system.

  was:
As a developer I want to pass an existing ActorSystem into StreamingContext at load time so that I do not have two actor systems running on a node in an Akka application.

This would mean putting Spark's actor system on its own named dispatchers, as well as exposing the currently private creation of its own actor system.

I would like to create an Akka Extension that wraps around Spark/Spark Streaming and Cassandra, so the programmatic creation would simply be this for a user:

    val extension = SparkCassandra(system)


> Add ability to pass an existing Akka ActorSystem into Spark
> -----------------------------------------------------------
>
>                 Key: SPARK-2593
>                 URL: https://issues.apache.org/jira/browse/SPARK-2593
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Helena Edelson
>
> As a developer I want to pass an existing ActorSystem into StreamingContext at load time so that I do not have two actor systems running on a node in an Akka application.
> This would mean putting Spark's actor system on its own named dispatchers, as well as exposing the currently private creation of its own actor system.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
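The core request above can be sketched as a hypothetical constructor overload. Note that StreamingContext does not accept an ActorSystem today; the three-argument constructor below is an assumption illustrating the API shape the issue asks for, not an existing Spark API.

```scala
import akka.actor.ActorSystem
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SharedActorSystemSketch {
  def main(args: Array[String]): Unit = {
    // The application's own actor system, created once per node.
    val system = ActorSystem("my-app")

    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("shared-actor-system")

    // Today, Spark creates a second, private actor system internally:
    val ssc = new StreamingContext(conf, Seconds(1))

    // Hypothetical overload requested by this issue, which would let Spark
    // reuse the caller's system (on its own named dispatchers) instead of
    // creating its own:
    // val ssc = new StreamingContext(conf, Seconds(1), system)
  }
}
```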
[jira] [Updated] (SPARK-2593) Add ability to pass an existing Akka ActorSystem into Spark
[ https://issues.apache.org/jira/browse/SPARK-2593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Helena Edelson updated SPARK-2593:
----------------------------------
    Issue Type: Improvement  (was: Brainstorming)

> Add ability to pass an existing Akka ActorSystem into Spark
> -----------------------------------------------------------
>
>                 Key: SPARK-2593
>                 URL: https://issues.apache.org/jira/browse/SPARK-2593
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Helena Edelson
>
> As a developer I want to pass an existing ActorSystem into StreamingContext at load time so that I do not have two actor systems running on a node.
> This would mean putting Spark's actor system on its own named dispatchers, as well as exposing the currently private creation of its own actor system.
> If it makes sense, I would like to create an Akka Extension that wraps around Spark/Spark Streaming and Cassandra, so the creation would simply be this for a user:
>     val extension = SparkCassandra(system)
> and using it is as easy as:
>     import extension._
>     spark.     // do work
> or,
>     streaming. // do work
> All config comes from reference.conf and user overrides of it. The conf file would pick up settings from the deployed environment first, then fall back to -D system properties, with a final fallback to the configured defaults.

--
This message was sent by Atlassian JIRA
(v6.2#6252)
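The SparkCassandra idea above would follow Akka's standard Extension pattern. The Extension/ExtensionId/ExtensionIdProvider machinery below is real Akka API, but SparkCassandra itself and its settings key are hypothetical sketches of code that does not exist.

```scala
import akka.actor.{ExtendedActorSystem, Extension, ExtensionId, ExtensionIdProvider}

// Hypothetical extension wrapping Spark/Spark Streaming and Cassandra,
// as described in the issue; the class and the config key are assumptions.
class SparkCassandraExt(system: ExtendedActorSystem) extends Extension {
  // Settings come from the actor system's config, i.e. reference.conf
  // defaults plus user overrides, matching the fallback chain described above.
  val sparkMaster: String =
    system.settings.config.getString("spark-cassandra.master")

  // lazy val spark: SparkContext = ...        // built from the shared config
  // lazy val streaming: StreamingContext = ... // reusing `system`, per the issue
}

object SparkCassandra
    extends ExtensionId[SparkCassandraExt]
    with ExtensionIdProvider {
  override def lookup = SparkCassandra
  override def createExtension(system: ExtendedActorSystem) =
    new SparkCassandraExt(system)
}

// Usage, as in the issue description:
//   val extension = SparkCassandra(system)
//   import extension._
```

Registering through ExtensionIdProvider also allows the extension to be loaded from config (`akka.extensions`) rather than programmatically, which fits the issue's goal of driving everything from reference.conf.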
[jira] [Updated] (SPARK-2593) Add ability to pass an existing Akka ActorSystem into Spark
[ https://issues.apache.org/jira/browse/SPARK-2593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Helena Edelson updated SPARK-2593:
----------------------------------
    Description: 
As a developer I want to pass an existing ActorSystem into StreamingContext at load time so that I do not have two actor systems running on a node in an Akka application.

This would mean putting Spark's actor system on its own named dispatchers, as well as exposing the currently private creation of its own actor system.

I would like to create an Akka Extension that wraps around Spark/Spark Streaming and Cassandra, so the programmatic creation would simply be this for a user:

    val extension = SparkCassandra(system)

  was:
As a developer I want to pass an existing ActorSystem into StreamingContext at load time so that I do not have two actor systems running on a node.

This would mean putting Spark's actor system on its own named dispatchers, as well as exposing the currently private creation of its own actor system.

If it makes sense, I would like to create an Akka Extension that wraps around Spark/Spark Streaming and Cassandra, so the creation would simply be this for a user:

    val extension = SparkCassandra(system)

and using it is as easy as:

    import extension._
    spark.     // do work

or,

    streaming. // do work

All config comes from reference.conf and user overrides of it. The conf file would pick up settings from the deployed environment first, then fall back to -D system properties, with a final fallback to the configured defaults.


> Add ability to pass an existing Akka ActorSystem into Spark
> -----------------------------------------------------------
>
>                 Key: SPARK-2593
>                 URL: https://issues.apache.org/jira/browse/SPARK-2593
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Helena Edelson
>
> As a developer I want to pass an existing ActorSystem into StreamingContext at load time so that I do not have two actor systems running on a node in an Akka application.
> This would mean putting Spark's actor system on its own named dispatchers, as well as exposing the currently private creation of its own actor system.
> I would like to create an Akka Extension that wraps around Spark/Spark Streaming and Cassandra, so the programmatic creation would simply be this for a user:
>     val extension = SparkCassandra(system)

--
This message was sent by Atlassian JIRA
(v6.2#6252)