[ 
https://issues.apache.org/jira/browse/SPARK-2593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Helena Edelson resolved SPARK-2593.
-----------------------------------
    Resolution: Won't Fix

As a user, I want to be able to use the latest version of Akka with Spark and 
not be locked into the version Spark is using :) I can live with two ActorSystem 
instances per node if it means I can use the Akka version I need. Hopefully 
there is a way in the build to scope Spark's Akka dependency so the two 
versions do not conflict.
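
One way to carry a separate Akka version alongside the one Spark ships, assuming 
an sbt build with the sbt-assembly plugin, is to shade the application's Akka 
packages so they cannot clash on the classpath. The plugin, version numbers, and 
rename prefix below are illustrative assumptions rather than anything prescribed 
by this issue, and shading Akka can interfere with its configuration paths, so 
treat this as a sketch of the mechanism only:

// build.sbt -- illustrative sketch only; assumes sbt-assembly is enabled
// in project/plugins.sbt and that the application runs against Spark 1.x.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % "1.1.0" % "provided",
  // The newer Akka the application wants, independent of Spark's own Akka.
  "com.typesafe.akka" %% "akka-actor" % "2.3.4"
)

// Rename the application's Akka classes inside the assembly jar so they
// do not collide with the Akka version bundled with Spark.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("akka.**" -> "appshaded.akka.@1")
    .inLibrary("com.typesafe.akka" %% "akka-actor" % "2.3.4")
    .inProject
)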

> Add ability to pass an existing Akka ActorSystem into Spark
> -----------------------------------------------------------
>
>                 Key: SPARK-2593
>                 URL: https://issues.apache.org/jira/browse/SPARK-2593
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Helena Edelson
>
> As a developer, I want to pass an existing ActorSystem into StreamingContext 
> at load time so that I do not have two actor systems running on a node in an 
> Akka application.
> This would mean running Spark's actor system on its own named dispatchers, as 
> well as exposing the currently private creation of its actor system.
>   
>  
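
For reference, a minimal sketch of what the requested hook could have looked 
like from user code; the ActorSystem-accepting overload is hypothetical (it was 
never added, hence the Won't Fix), and only the two-argument StreamingContext 
constructor below actually exists:

import akka.actor.ActorSystem
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// The application's own actor system, which the issue would have liked to reuse.
val appSystem = ActorSystem("my-app")

val conf = new SparkConf().setAppName("shared-actor-system").setMaster("local[2]")
// Today Spark creates its own internal ActorSystem here, giving two per node.
val ssc = new StreamingContext(conf, Seconds(1))
// Imagined overload from the issue (hypothetical, does not exist):
// val ssc = new StreamingContext(conf, Seconds(1), actorSystem = appSystem)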



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
