>> t be some rdd in spark or some external
>> file ?
>>
>>
>>
>> Thanks,
>>
>> Udbhav
>>
>> *From:* ayan guha [mailto:guha.a...@gmail.com]
>> *Sent:* Friday, September 16, 2016 3:01 AM
>> *To:* Udbhav Agarwal <udbhav.agar...@syncoms.com>
>> *Cc:* user <user@spark.apache.org>
>> *Subject:* RE: Spark processing Multiple Streams from a single stream

You may consider writing back to Kafka from main stream and then have
downstream consumers.
This will keep things modular and independent.
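A minimal sketch of that pattern, assuming the Spark 1.x direct Kafka API;
the broker address and the topic names "input-topic" and "processed-topic"
are placeholders:

import java.util.Properties
import kafka.serializer.StringDecoder
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object MainStreamToKafka {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(new SparkConf().setAppName("main-stream"), Seconds(5))

    // Consume the raw topic in the main stream.
    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
    val input = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, Set("input-topic"))

    // Do the shared processing once (toUpperCase is a stand-in), then
    // publish the result to a second topic that each downstream consumer
    // reads independently.
    input.map(_._2.toUpperCase).foreachRDD { rdd =>
      rdd.foreachPartition { records =>
        // One producer per partition, created on the executor side.
        val props = new Properties()
        props.put("bootstrap.servers", "localhost:9092")
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")
        val producer = new KafkaProducer[String, String](props)
        records.foreach(r => producer.send(new ProducerRecord[String, String]("processed-topic", r)))
        producer.close()
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}

Publishing the shared result to a topic means each downstream job can be
deployed, scaled, and restarted on its own.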
On 15 Sep 2016, "Udbhav Agarwal" <udbhav.agar...@syncoms.com> wrote:

> ?
> Thanks,
> Udbhav
>
> *From:* ayan guha [mailto:guha.a...@gmail.com]
> *Sent:* Thursday, September 15, 2016 6:43 PM
> *To:* Udbhav Agarwal <udbhav.agar...@syncoms.com>
> *Cc:* user <user@spark.apache.org>
> *Subject:* Re: Spark processing Multiple Streams from a single stream

Depending on source. For example, if source is Kafka then you can write 4
streaming consumers.
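
In case a concrete sketch helps, again assuming the Spark 1.x direct Kafka
API, with a placeholder topic name ("events") and trivial placeholder
pipelines:

import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object FourConsumers {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(new SparkConf().setAppName("four-consumers"), Seconds(5))
    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")

    // Four independent direct streams over the same topic: each one sees
    // every message, so each pipeline processes the full feed in parallel.
    (1 to 4).foreach { i =>
      val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
        ssc, kafkaParams, Set("events"))
      // Replace this placeholder with the i-th way of processing a message.
      stream.map { case (_, msg) => s"pipeline-$i: $msg" }.print()
    }

    ssc.start()
    ssc.awaitTermination()
  }
}

If re-reading the topic four times is wasteful, you can instead create the
stream once and attach four different output operations to the same DStream.
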
On 15 Sep 2016 20:11, "Udbhav Agarwal" wrote:
> Hi All,
>
> I have a scenario where I want to process a message in various ways in
> parallel. For instance a message is