Hi,

You can also check out the Spark Kernel project:
https://github.com/ibm-et/spark-kernel

It can plug into the upcoming IPython 3.0 notebook (providing a Scala/Spark
language interface) and provides an API to submit code snippets (like the
Spark Shell) and get results directly back, rather than having to write out
your results elsewhere. A client library (
https://github.com/ibm-et/spark-kernel/wiki/Guide-for-the-Spark-Kernel-Client)
is available in Scala, so you can build applications that communicate
interactively with Apache Spark.
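
To give a rough idea of what that looks like, here is a small sketch. I am
writing the class and method names from memory, so please treat them as
approximate and check the client guide above for the exact API:

  import com.typesafe.config.ConfigFactory

  // Bootstrap a client that connects to a running Spark Kernel using the
  // connection profile from the loaded config (approximate class names).
  val client = (new ClientBootstrap(ConfigFactory.load())
    with StandardSystemInitialization
    with StandardHandlerInitialization).createClient()

  // Submit a snippet just as you would type it into the Spark Shell and
  // register a callback for the result.
  client.execute("val rdd = sc.parallelize(1 to 100); rdd.sum()")
    .onResult(result => println(s"Result: ${result.data}"))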

You can find a getting started section here:
https://github.com/ibm-et/spark-kernel/wiki/Getting-Started-with-the-Spark-Kernel

If you have any more questions about the project, feel free to email me!

Signed,
Chip Senkbeil

On Thu Feb 05 2015 at 10:58:01 AM Corey Nolet <cjno...@gmail.com> wrote:

> Here's another lightweight example of running a SparkContext in a common
> java servlet container: https://github.com/calrissian/spark-jetty-server
>
> On Thu, Feb 5, 2015 at 11:46 AM, Charles Feduke <charles.fed...@gmail.com>
> wrote:
>
>> If you want to design something like the Spark shell, have a look at:
>>
>> http://zeppelin-project.org/
>>
>> It's open source and may already do what you need. If not, its source code
>> will be helpful in answering your questions about how to integrate with
>> long-running jobs.
>>
>>
>> On Thu Feb 05 2015 at 11:42:56 AM Boromir Widas <vcsub...@gmail.com>
>> wrote:
>>
>>> You can check out https://github.com/spark-jobserver/spark-jobserver -
>>> this allows several users to upload their jars and run jobs with a REST
>>> interface.
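>>>
>>> For example, you can drive that REST API from plain Scala with nothing but
>>> the JDK's HttpURLConnection. The /jars and /jobs endpoints and the default
>>> port 8090 below are taken from the project's README, so double-check them
>>> against the version you deploy (the jar path and job class are made up):
>>>
>>>   import java.net.{HttpURLConnection, URL}
>>>   import java.nio.file.{Files, Paths}
>>>   import scala.io.Source
>>>
>>>   object JobServerClient {
>>>     val base = "http://localhost:8090"
>>>
>>>     // POST a request body to the job server and return the response text.
>>>     def post(path: String, body: Array[Byte]): String = {
>>>       val conn = new URL(base + path).openConnection().asInstanceOf[HttpURLConnection]
>>>       conn.setRequestMethod("POST")
>>>       conn.setDoOutput(true)
>>>       val out = conn.getOutputStream
>>>       out.write(body)
>>>       out.close()
>>>       Source.fromInputStream(conn.getInputStream).mkString
>>>     }
>>>
>>>     def main(args: Array[String]): Unit = {
>>>       // 1. Upload the application jar under the app name "myapp".
>>>       println(post("/jars/myapp", Files.readAllBytes(Paths.get("target/myapp.jar"))))
>>>
>>>       // 2. Start a job; classPath must name a class that implements the
>>>       //    job server's SparkJob trait.
>>>       println(post("/jobs?appName=myapp&classPath=com.example.MyJob", Array.empty[Byte]))
>>>     }
>>>   }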
>>>
>>> However, if all users need the same functionality, you can write a simple
>>> spray server that acts as the driver and hosts the Spark context and RDDs,
>>> launched in client mode.
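>>>
>>> Something along these lines (an untested sketch; the endpoint and the
>>> cached RDD are only for illustration; the point is that the SparkContext
>>> is created once at startup and shared by every request):
>>>
>>>   import akka.actor.ActorSystem
>>>   import org.apache.spark.{SparkConf, SparkContext}
>>>   import spray.routing.SimpleRoutingApp
>>>
>>>   object LongLivedDriver extends App with SimpleRoutingApp {
>>>     implicit val system = ActorSystem("spark-http")
>>>
>>>     // One context for the lifetime of the server; the master is supplied
>>>     // by spark-submit when the app is launched in client mode.
>>>     val sc = new SparkContext(new SparkConf().setAppName("long-lived-driver"))
>>>
>>>     // An RDD initialized up front and reused across requests.
>>>     val numbers = sc.parallelize(1 to 1000000).cache()
>>>
>>>     startServer(interface = "0.0.0.0", port = 8080) {
>>>       path("count") {
>>>         get {
>>>           complete {
>>>             // Each HTTP request runs a job on the shared, long-lived context.
>>>             numbers.count().toString
>>>           }
>>>         }
>>>       }
>>>     }
>>>   }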
>>>
>>> On Thu, Feb 5, 2015 at 10:25 AM, Shuai Zheng <szheng.c...@gmail.com>
>>> wrote:
>>>
>>>> Hi All,
>>>>
>>>>
>>>>
>>>> I want to develop a server side application:
>>>>
>>>>
>>>>
>>>> User submits a request -> server runs a Spark application and returns the
>>>> result (this might take a few seconds).
>>>>
>>>>
>>>>
>>>> So I want to host a server that keeps a long-lived context, but I don't
>>>> know whether this is reasonable or not.
>>>>
>>>>
>>>>
>>>> Basically I am trying to keep a global JavaSparkContext instance alive and
>>>> initialize some RDDs up front. Then my Java application will use it to
>>>> submit jobs.
>>>>
>>>>
>>>>
>>>> So now I have some questions:
>>>>
>>>>
>>>>
>>>> 1. If I don't close it, is there any timeout I need to configure on the
>>>> Spark server?
>>>>
>>>> 2. In theory I want to design something similar to the Spark shell (which
>>>> also hosts a default sc), just not shell-based.
>>>>
>>>>
>>>>
>>>> Any suggestions? I think my requirement is very common for application
>>>> development, so someone must have done this before.
>>>>
>>>>
>>>>
>>>> Regards,
>>>>
>>>>
>>>>
>>>> Shawn
>>>>
>>>
>>>
>
