[ https://issues.apache.org/jira/browse/SPARK-3215?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14123749#comment-14123749 ]
Marcelo Vanzin commented on SPARK-3215:
---------------------------------------

I updated the prototype to include a Java API and to stop using SparkConf in the API.

> Add remote interface for SparkContext
> -------------------------------------
>
>                 Key: SPARK-3215
>                 URL: https://issues.apache.org/jira/browse/SPARK-3215
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>            Reporter: Marcelo Vanzin
>              Labels: hive
>         Attachments: RemoteSparkContext.pdf
>
> A quick description of the issue: as part of running Hive jobs on top of
> Spark, it's desirable to have a SparkContext running in the background and
> listening for job requests for a particular user session.
>
> Running multiple contexts in the same JVM is not a good solution. Not only
> does SparkContext currently have issues sharing a JVM among multiple
> instances, but the JVM running the contexts also becomes a huge bottleneck
> in the system.
>
> So I'm proposing a solution where a SparkContext runs in a separate
> process and listens for requests from the client application via some RPC
> interface (most probably Akka).
>
> I'll attach a document shortly with the current proposal. Let's use this bug
> to discuss the proposal and any other suggestions.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
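The attached RemoteSparkContext.pdf carries the actual design. Purely as an illustration of the shape such a client-side Java API might take (every name below is hypothetical, not taken from the prototype), the client would hold no SparkContext of its own; it would submit jobs to the remote process and get a Future back:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.Future;
import java.util.function.Function;

// Hypothetical sketch of a remote-context client API. The client never
// creates a SparkContext in its own JVM; it sends job functions to a
// context running in a separate process and receives a Future for the
// result. All interface and method names here are assumptions.
interface RemoteSparkClient {
    // JobContext stands in for whatever handle the remote side gives
    // the job function (again, an assumption, not the real API).
    <T> Future<T> submit(Function<JobContext, T> job);

    void stop();
}

interface JobContext {
    String applicationId();
}

// Trivial in-process stub, only to make the API shape concrete; a real
// implementation would serialize the job and ship it over RPC (e.g.
// Akka, per the proposal) to the JVM hosting the SparkContext.
class LocalStubClient implements RemoteSparkClient {
    private final JobContext ctx = () -> "local-stub";

    @Override
    public <T> Future<T> submit(Function<JobContext, T> job) {
        return CompletableFuture.completedFuture(job.apply(ctx));
    }

    @Override
    public void stop() { }
}

public class RemoteContextSketch {
    public static void main(String[] args) throws Exception {
        RemoteSparkClient client = new LocalStubClient();
        Future<Integer> result = client.submit(ctx -> 21 * 2);
        System.out.println(result.get()); // prints 42
        client.stop();
    }
}
```

Keeping the API in terms of `Future`-returning job submissions is what lets one long-lived remote context serve many requests from a user session without the client ever touching SparkContext directly.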