The easiest approach is to use a queue, Kafka for example. Push your JSON request
strings into Kafka, then connect Spark Streaming to Kafka, pull the data from
it, and process it there. Spark Streaming will split the work into jobs and
pipeline the data for you.
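A minimal sketch of the consuming side, assuming a Kafka topic named "requests", a ZooKeeper quorum at localhost:2181, and the spark-streaming-kafka artifact on the classpath (topic, quorum, and group names are illustrative, not from your setup):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object JsonRequestStream {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("JsonRequestStream")
    // Micro-batch interval of 1 second; tune for your latency needs.
    val ssc = new StreamingContext(conf, Seconds(1))

    // Pull the JSON request strings that were pushed into the "requests" topic.
    val messages = KafkaUtils.createStream(
      ssc, "localhost:2181", "spark-consumer-group", Map("requests" -> 1))

    // Each record is a (key, value) pair; the value holds the JSON string.
    messages.map(_._2).foreachRDD { rdd =>
      rdd.foreach { json =>
        // Parse and process each request here.
        println(json)
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

At 50 requests/sec this is a light load for a single receiver; the batch interval controls how quickly the first processing step responds.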

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



On Thu, Mar 6, 2014 at 6:24 PM, sonyjv <sonyjvech...@yahoo.com> wrote:

> Thanks Mayur for your response.
>
> I think I need to clarify the first part of my query. The JSON based REST
> API will be called by external interfaces. These requests need to be
> processed in streaming mode in Spark. I am not clear about the following
> points:
>
> 1. How can JSON request strings (50 per sec) be continuously streamed to
> Spark?
> 2. The processing of each request in Spark will not take long, but it needs
> to be split into multiple steps to render a fast initial response. So for
> coordinating the Spark jobs, do I have to use Kafka or another queue, or
> can I stream directly from one job to another?
>
> Regards,
> Sony
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Streaming-JSON-string-from-REST-Api-in-Spring-tp2358p2383.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
