[ https://issues.apache.org/jira/browse/SPARK-3215?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Marcelo Vanzin updated SPARK-3215:
----------------------------------
    Attachment: RemoteSparkContext.pdf

Initial proposal for a remote context interface. Note that this is not a formal design document, just a high-level proposal, so it doesn't go deeply into what APIs would be exposed or anything like that.

> Add remote interface for SparkContext
> -------------------------------------
>
>                 Key: SPARK-3215
>                 URL: https://issues.apache.org/jira/browse/SPARK-3215
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>            Reporter: Marcelo Vanzin
>              Labels: hive
>         Attachments: RemoteSparkContext.pdf
>
>
> A quick description of the issue: as part of running Hive jobs on top of
> Spark, it's desirable to have a SparkContext running in the background and
> listening for job requests for a particular user session.
> Running multiple contexts in the same JVM is not a good solution: not only
> does SparkContext currently have issues sharing a JVM among multiple
> instances, but doing so also turns that JVM into a huge bottleneck in the
> system.
> So I'm proposing a solution where the SparkContext runs in a separate
> process and listens for requests from the client application via some RPC
> interface (most probably Akka).
> I'll attach a document shortly with the current proposal. Let's use this bug
> to discuss the proposal and any other suggestions.

--
This message was sent by Atlassian JIRA
(v6.2#6252)
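The architecture described above (a SparkContext living in its own process, with client applications submitting work over an RPC channel) implies some kind of client-side handle for submitting jobs and polling their status. A minimal sketch of what such a handle might look like is below; the interface name, method names, and the in-process stub are all hypothetical illustrations, not APIs from the attached proposal.

```java
// Hypothetical client-side view of a remote SparkContext. All names here
// are illustrative only; the real interface is TBD in the proposal.
interface RemoteContextClient {
    // Submits a job payload; returns an opaque handle for later queries.
    String submitJob(String jobPayload);

    // Returns the current status for a previously returned handle.
    String jobStatus(String handle);

    // Shuts down the client's connection to the remote context.
    void stop();
}

// Trivial in-process stub standing in for the real RPC-backed client
// (which would talk to the separate SparkContext process, e.g. via Akka).
class StubRemoteContextClient implements RemoteContextClient {
    private int counter = 0;

    public String submitJob(String jobPayload) {
        counter++;
        return "job-" + counter;  // hand back a fresh opaque handle
    }

    public String jobStatus(String handle) {
        return "SUCCEEDED";  // a real client would query the remote process
    }

    public void stop() {
        // a real client would close the RPC channel here
    }
}
```

The key design point the proposal makes is the process boundary itself: the client only ever holds an opaque handle, so the heavyweight SparkContext JVM is isolated per user session rather than shared.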