I think you can use operations like foreachRDD or transform to get access to
the RDDs underlying the DStream, and then run Spark SQL over each of them.
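
Something along these lines (a rough sketch only, assuming the Spark 1.3+
DataFrame API, a socket source on localhost:9999 and a made-up Event case
class for the schema; adapt to your actual source and tuple format):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Hypothetical schema for the incoming tuples, e.g. lines like "1,foo"
case class Event(id: Int, value: String)

// Lazily instantiated SQLContext, reused across micro-batches
object SQLContextSingleton {
  @transient private var instance: SQLContext = _
  def getInstance(sc: SparkContext): SQLContext = {
    if (instance == null) instance = new SQLContext(sc)
    instance
  }
}

object StreamingSql {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("StreamingSql")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Live source instead of a filesystem; here a plain socket stream
    val lines = ssc.socketTextStream("localhost", 9999)

    lines.foreachRDD { rdd =>
      val sqlContext = SQLContextSingleton.getInstance(rdd.sparkContext)
      import sqlContext.implicits._

      // Turn the micro-batch RDD into a DataFrame and expose it as a temp table
      val events = rdd.map(_.split(","))
                      .map(p => Event(p(0).trim.toInt, p(1).trim))
                      .toDF()
      events.registerTempTable("events")

      // Any ad hoc SQL over the current batch
      sqlContext.sql("SELECT value, COUNT(*) AS cnt FROM events GROUP BY value").show()
    }

    ssc.start()
    ssc.awaitTermination()
  }
}

If you use transform instead of foreachRDD, the query result comes back as a
new DStream, which is handy if you want to keep chaining stream operations
on top of the SQL output.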

Thanks
Best Regards

On Sat, Feb 28, 2015 at 3:27 PM, Ashish Mukherjee <
ashish.mukher...@gmail.com> wrote:

> Hi,
>
> I have been looking at Spark Streaming, which seems to be aimed at the use
> case of live streams that are processed one line at a time, generally in
> real time.
>
> Since SparkSQL reads data from some filesystem, I was wondering if there
> is something which connects SparkSQL with Spark Streaming, so I can send
> live relational tuples in a stream (rather than read filesystem data) for
> SQL operations.
>
> Also, at present, doing this with Spark Streaming would involve the
> complexity of handling multiple DStreams etc., since I may want to run
> multiple ad hoc queries of this kind on ad hoc data I stream through.
>
> Has anyone done this kind of thing with Spark before, i.e. a combination of
> SparkSQL with Streaming?
>
> Regards,
> Ashish
>
