If you want to design something like the Spark shell, have a look at:

http://zeppelin-project.org/

It's open source and may already do what you need. If not, its source code
will help answer your questions about how to integrate with long-running
jobs.

On Thu Feb 05 2015 at 11:42:56 AM Boromir Widas <vcsub...@gmail.com> wrote:

> You can check out https://github.com/spark-jobserver/spark-jobserver -
> this lets several users upload their jars and run jobs through a REST
> interface.
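>
> For example, the job server's REST API looks roughly like this (quoting
> from memory of the project's README, so check the repo for the exact
> endpoints and the jar/class names you actually use):
>
>   # upload a jar under an application name
>   curl --data-binary @my-job.jar localhost:8090/jars/myapp
>   # run a job from that jar
>   curl -d "" 'localhost:8090/jobs?appName=myapp&classPath=com.example.MyJob'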
>
> However, if all users need the same functionality, you can write a simple
> spray server that acts as the driver and hosts the Spark context and RDDs,
> launched in client mode.
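>
> A minimal sketch of that driver-as-server idea, in Java since you mention
> JavaSparkContext below (using the JDK's built-in HttpServer rather than
> spray, with a made-up RDD and endpoint, just to show the shape):
>
>   import java.io.OutputStream;
>   import java.net.InetSocketAddress;
>   import java.util.Arrays;
>   import com.sun.net.httpserver.HttpServer;
>   import org.apache.spark.SparkConf;
>   import org.apache.spark.api.java.JavaRDD;
>   import org.apache.spark.api.java.JavaSparkContext;
>
>   public class SparkHttpDriver {
>     public static void main(String[] args) throws Exception {
>       // One long-lived context for the life of the server process.
>       SparkConf conf = new SparkConf()
>           .setAppName("long-lived-driver").setMaster("local[*]");
>       JavaSparkContext sc = new JavaSparkContext(conf);
>
>       // Built once, cached, and shared by every request.
>       JavaRDD<Integer> numbers =
>           sc.parallelize(Arrays.asList(1, 2, 3, 4, 5)).cache();
>
>       HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
>       server.createContext("/count", exchange -> {
>         // Each request submits a job on the shared context.
>         String body = String.valueOf(numbers.count());
>         exchange.sendResponseHeaders(200, body.length());
>         try (OutputStream os = exchange.getResponseBody()) {
>           os.write(body.getBytes());
>         }
>       });
>       server.start(); // sc is never closed; it lives with the server
>     }
>   }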
>
> On Thu, Feb 5, 2015 at 10:25 AM, Shuai Zheng <szheng.c...@gmail.com>
> wrote:
>
>> Hi All,
>>
>>
>>
>> I want to develop a server side application:
>>
>>
>>
>> User submits a request -> the server runs a Spark application and returns
>> the result (this might take a few seconds).
>>
>>
>>
>> So I want the server to keep a long-lived context; I don't know whether
>> this is reasonable or not.
>>
>>
>>
>> Basically I want to keep a global JavaSparkContext instance alive and
>> initialize some RDDs. Then my Java application will use it to submit jobs.
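>>
>> Roughly the pattern I have in mind (class and RDD names are placeholders):
>>
>>   import java.util.Arrays;
>>   import org.apache.spark.SparkConf;
>>   import org.apache.spark.api.java.JavaRDD;
>>   import org.apache.spark.api.java.JavaSparkContext;
>>
>>   // Holds one JavaSparkContext for the whole application.
>>   public final class SparkHolder {
>>     public static final JavaSparkContext SC = new JavaSparkContext(
>>         new SparkConf().setAppName("shared-context").setMaster("local[*]"));
>>
>>     // An RDD initialized once and cached for reuse by later jobs.
>>     public static final JavaRDD<String> WORDS =
>>         SC.parallelize(Arrays.asList("a", "b", "c")).cache();
>>
>>     private SparkHolder() {}
>>   }
>>
>>   // Each user request then just runs a job on the shared context, e.g.:
>>   // long n = SparkHolder.WORDS.filter(w -> w.startsWith("a")).count();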
>>
>>
>>
>> So now I have some questions:
>>
>>
>>
>> 1. If I don't close it, is there any timeout I need to configure on the
>> Spark server?
>>
>> 2. In theory I want to design something similar to the Spark shell (which
>> also hosts a default sc), except it is not shell-based.
>>
>>
>>
>> Any suggestions? I think my requirement is very common for application
>> development, so surely someone has done this before?
>>
>>
>>
>> Regards,
>>
>>
>>
>> Shawn
>>
>
>