Any ideas, guys? What are the best practices for processing multiple streams?
I found a few Stack Overflow comments recommending a separate jar per
stream / use case, but that isn't really what I want; ideally one or more
Spark streaming contexts could all be handled within a single jar.
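
For reference, the Spark Streaming programming guide notes that only one StreamingContext can be active per JVM at a time, but a single context can host multiple input DStreams, and `window()` can emulate a coarser batch interval for individual streams. Below is a minimal sketch of that single-jar, single-context approach, assuming Spark 1.x with the receiver-based spark-streaming-kafka (0.8) API; the topic names, Zookeeper quorum, and consumer group are hypothetical placeholders:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object MultiStreamApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("multi-stream-app")
    // Only one active StreamingContext per JVM: pick the smallest
    // batch interval any of the streams needs.
    val ssc = new StreamingContext(conf, Seconds(5))

    // Separate input DStreams per topic, all within the same context.
    val clicks = KafkaUtils.createStream(
      ssc, "zk:2181", "app-group", Map("clicks" -> 1))
    val orders = KafkaUtils.createStream(
      ssc, "zk:2181", "app-group", Map("orders" -> 1))

    // Per-topic logic runs independently on the shared batch schedule.
    clicks.map(_._2).count().print()

    // Emulate a larger "batch duration" for one stream via windowing:
    // here, orders are processed every 30s over 30s windows
    // (both must be multiples of the context's 5s batch interval).
    orders.map(_._2).window(Seconds(30), Seconds(30)).count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

This is only a sketch of one possible layout, not a definitive answer; truly independent batch durations still require separate applications (or separate JVMs), which is why those Stack Overflow answers suggest one jar per stream.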

Please reply, guys.

Awaiting your thoughts,

Thanks,
Sumit

On Mon, Aug 1, 2016 at 12:24 AM, Sumit Khanna <sumit.kha...@askme.in> wrote:

> Any ideas on this one guys ?
>
> I can do a sample run, but I can't be sure what problems might surface, if
> any. How can I ensure a different batchDuration, etc., per StreamingContext?
>
> Thanks,
>
> On Sun, Jul 31, 2016 at 10:50 AM, Sumit Khanna <sumit.kha...@askme.in>
> wrote:
>
>> Hey,
>>
>> I was wondering if I could create multiple Spark streaming contexts in my
>> application (e.g. instantiating a worker actor per topic, each with its own
>> streaming context, its own batch duration, everything).
>>
>> What are the caveats if any?
>> What are the best practices?
>>
>> I have googled half-heartedly on this, but the topic hasn't really been
>> demystified yet. The closest I found was something like
>>
>>
>> http://stackoverflow.com/questions/29612726/how-do-you-setup-multiple-spark-streaming-jobs-with-different-batch-durations
>>
>>
>> http://stackoverflow.com/questions/37006565/multiple-spark-streaming-contexts-on-one-worker
>>
>> Thanks in Advance!
>> Sumit
>>
>
>
