@spark.apache.org
Subject: Re: How to design a long live spark application
Yes, you can submit multiple actions from different threads to the same
SparkContext; it is safe.
Indeed, what you want to achieve is quite common: exposing operations over a
SparkContext through HTTP.
I have used spray
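The claim above (several threads submitting actions against one shared context) can be sketched without a live cluster. In this sketch the class names are hypothetical and the Spark action is stubbed out with a local computation; a real version would call something like sc.parallelize(data).sum() on a genuine SparkContext created once at startup.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

// Hypothetical stand-in for a shared SparkContext: a long-lived server
// creates one context at startup and reuses it for every request. The
// "action" below is a placeholder for a real Spark call such as
// sc.parallelize(1 to n).sum().
class SharedContext {
    long runAction(int n) {
        return IntStream.rangeClosed(1, n).asLongStream().sum();
    }
}

public class MultiThreadedDriver {
    public static void main(String[] args) throws Exception {
        SharedContext sc = new SharedContext();           // one shared context
        ExecutorService pool = Executors.newFixedThreadPool(4);

        // Submit several actions from different threads against the same context.
        List<Future<Long>> futures = IntStream.rangeClosed(1, 4)
                .mapToObj(i -> pool.submit(() -> sc.runAction(i * 10)))
                .collect(Collectors.toList());

        for (Future<Long> f : futures) {
            System.out.println(f.get());  // 55, 210, 465, 820
        }
        pool.shutdown();
    }
}
```

By default concurrent jobs run FIFO within one application; Spark's fair scheduler can be enabled if requests should share cluster resources more evenly.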
Hi All,
I want to develop a server side application:
User submits a request -> server runs a Spark application and returns the
result (this might take a few seconds).
So I want the server to keep a long-lived context, but I don't know
whether this is reasonable or not.
Basically I try to have a
...@gmail.com]
*Sent:* Thursday, February 05, 2015 11:55 AM
*To:* Charles Feduke
*Cc:* Shuai Zheng; user@spark.apache.org
*Subject:* Re: How to design a long live spark application
Here's another lightweight example of running a SparkContext in a common java
servlet container: https://github.com/calrissian/spark-jetty-server
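The pattern behind that example (one long-lived context created at server startup, reused by every HTTP handler) can be illustrated with the JDK's built-in HTTP server. This is a minimal sketch, not the spark-jetty-server code: LongLivedContext is a hypothetical stand-in for a SparkContext, the /sum endpoint and port 8080 are assumptions, and the stubbed local computation replaces a real Spark action.

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.stream.LongStream;

// Hypothetical stand-in for a long-lived SparkContext created once at
// startup; a real handler would run an action on the shared context,
// e.g. sc.parallelize(...).sum().
class LongLivedContext {
    long sum(int n) {
        return LongStream.rangeClosed(1, n).sum();
    }
}

public class SparkHttpServer {
    public static void main(String[] args) throws Exception {
        LongLivedContext ctx = new LongLivedContext();   // created once, reused
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/sum", (HttpExchange ex) -> {
            String query = ex.getRequestURI().getQuery();     // e.g. "n=100"
            int n = (query != null && query.startsWith("n="))
                    ? Integer.parseInt(query.substring(2)) : 10;
            byte[] body = Long.toString(ctx.sum(n)).getBytes(StandardCharsets.UTF_8);
            ex.sendResponseHeaders(200, body.length);
            try (OutputStream os = ex.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();  // e.g. GET http://localhost:8080/sum?n=100 -> "5050"
    }
}
```

The point is only the lifecycle: the expensive context is built once in main(), and each request handler borrows it rather than creating its own.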
On Thu, Feb 5, 2015 at 11:46 AM, Charles Feduke charles.fed...@gmail.com
wrote:
If you want to design something like Spark shell have a look at:
http://zeppelin-project.org/
It's open source and may already do what you need. If not, its source code
will help answer your questions about integrating with long-running jobs.
Hi,
You can also check out the Spark Kernel project:
https://github.com/ibm-et/spark-kernel
It can plug into the upcoming IPython 3.0 notebook (providing a Scala/Spark
language interface) and provides an API to submit code snippets (like the
Spark Shell) and get results directly back, rather