Marcelo Vanzin created SPARK-3215:
-------------------------------------

             Summary: Add remote interface for SparkContext
                 Key: SPARK-3215
                 URL: https://issues.apache.org/jira/browse/SPARK-3215
             Project: Spark
          Issue Type: New Feature
          Components: Spark Core
            Reporter: Marcelo Vanzin


A quick description of the issue: as part of running Hive jobs on top of Spark, 
it's desirable to have a SparkContext that is running in the background and 
listening for job requests for a particular user session.

Running multiple contexts in the same JVM is not a very good solution. Not only 
does SparkContext currently have issues sharing the same JVM among multiple 
instances, but it also turns the JVM running the contexts into a huge bottleneck 
in the system.

So I'm proposing a solution where the SparkContext runs in a separate process 
and listens for requests from the client application via some RPC interface 
(most probably Akka).

I'll attach a document shortly with the current proposal. Let's use this bug to 
discuss the proposal and any other suggestions.
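
To make the idea a bit more concrete, here is a minimal sketch of what the 
client-side API could look like. All names below are hypothetical and only for 
illustration; the attached document will have the actual proposal.

{code:scala}
import scala.concurrent.Future

// A job is a closure that executes inside the remote SparkContext's JVM
// and returns a serializable result to the client.
// (Hypothetical type, for illustration only.)
trait RemoteJob[T] extends Serializable {
  def call(sc: org.apache.spark.SparkContext): T
}

// Client-side handle to a SparkContext living in a separate process.
// Each call would be forwarded over the RPC channel (e.g. Akka) to the
// backend process hosting the context. (Hypothetical type.)
trait RemoteSparkClient {
  // Serialize the job, ship it to the remote context, and return a
  // handle to the eventual result.
  def submit[T](job: RemoteJob[T]): Future[T]

  // Shut down the remote context and the RPC connection.
  def stop(): Unit
}
{code}

The point is that the client only holds a lightweight handle; the SparkContext 
itself and all of its resources stay in the separate process.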



