Here are a few ways to achieve what you're looking to do:

Spark Jetty Server - https://github.com/cjnolet/spark-jetty-server

Spark Job Server - https://github.com/spark-jobserver/spark-jobserver - defines a REST API for Spark
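For example, once your jar is uploaded to the job server, starting a job is a single HTTP POST. A rough sketch in Scala against the endpoints documented in the spark-jobserver README (localhost:8090 is its default address; the appName and classPath values here are placeholders):

import java.net.{HttpURLConnection, URL}
import scala.io.Source

object SubmitJob {
  def main(args: Array[String]): Unit = {
    // POST /jobs?appName=<uploaded jar>&classPath=<job class> starts a job.
    val url = new URL(
      "http://localhost:8090/jobs?appName=myapp&classPath=com.example.MyJob")
    val conn = url.openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("POST")
    conn.setDoOutput(true)
    // Job configuration goes in the request body as Typesafe Config syntax.
    conn.getOutputStream.write("input.string = a b c".getBytes("UTF-8"))
    // The response is a JSON description of the job's status.
    val body = Source.fromInputStream(conn.getInputStream)
    try println(body.mkString) finally body.close()
    conn.disconnect()
  }
}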

Hue - http://gethue.com/get-started-with-spark-deploy-spark-server-and-compute-pi-from-your-web-browser/

Spark Kernel project - https://github.com/ibm-et/spark-kernel

> The Spark Kernel's goal is to serve as the foundation for interactive
> applications. The project provides a client library in Scala that abstracts
> connecting to the kernel (containing a Spark Context), which can be
> embedded into a web application. We demonstrated this at StrataConf when we
> embedded the Spark Kernel client into a Play application to provide an
> interactive web application that communicates to Spark via the Spark Kernel
> (hosting a SparkContext).


Hopefully one of those will give you what you're looking for.
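
As for the original question of calling a REST service from inside the job itself: that works too; there is nothing special about making HTTP calls from a task. The usual pattern is mapPartitions, so any client setup happens once per partition rather than once per record. A rough sketch (the lookup URL, input path, and record layout are made up for illustration):

import scala.io.Source
import org.apache.spark.{SparkConf, SparkContext}

object AugmentViaRest {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("augment-via-rest"))

    // Hypothetical layout: "id,value" lines, where a trailing comma means
    // the value is missing and must be fetched from the REST service.
    val records = sc.textFile("hdfs:///data/records")

    val augmented = records.mapPartitions { rows =>
      rows.map {
        case row if row.endsWith(",") =>
          val id = row.takeWhile(_ != ',')
          // Blocking GET per missing record; fine when call volume is low.
          val src = Source.fromURL(s"http://lookup.example.com/records/$id")
          try row + src.mkString finally src.close()
        case row => row  // already complete, pass through unchanged
      }
    }

    augmented.saveAsTextFile("hdfs:///data/records-augmented")
    sc.stop()
  }
}

Since the REST calls run on the executors, just make sure the service is reachable from the worker nodes, not only from the driver.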

-Todd

On Tue, Mar 31, 2015 at 5:06 PM, Burak Yavuz <brk...@gmail.com> wrote:

> Hi,
>
> If I recall correctly, I've read people integrating REST calls to Spark
> Streaming jobs in the user list. I don't imagine any cases for why it
> shouldn't be possible.
>
> Best,
> Burak
>
> On Tue, Mar 31, 2015 at 1:46 PM, Minnow Noir <minnown...@gmail.com> wrote:
>
>> We have some data on Hadoop that needs to be augmented with data only
>> available to us via a REST service.  We're using Spark to search for, and
>> correct, missing data. Even though there are a lot of records to scour for
>> missing data, the total number of calls to the service is expected to be
>> low, so it would be ideal to do the whole job in Spark as we scour the data.
>>
>> I don't see anything obvious in the API or on Google relating to making
>> REST calls from a Spark job.  Is it possible?
>>
>> Thanks,
>>
>> Alec
>>
>
>
