Re: Spark StreamingContext Question

2018-03-07 Thread रविशंकर नायर
Got it, thanks.

On Wed, Mar 7, 2018 at 4:32 AM, Gerard Maas wrote:
> Hi,
>
> You can run as many jobs in your cluster as you want, provided you have
> enough capacity. The one-streaming-context constraint is per job.
>
> You can submit several jobs for Flume and some

Re: Spark StreamingContext Question

2018-03-07 Thread Gerard Maas
Hi,

You can run as many jobs in your cluster as you want, provided you have enough capacity. The one-streaming-context constraint is per job.

You can submit several jobs for Flume and some others for Twitter, Kafka, etc. If you are getting started with Streaming with Spark, I'd recommend you to
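To illustrate the point above: each application submitted via spark-submit runs in its own driver JVM, so each one gets its own StreamingContext, and the one-context-per-JVM rule is never violated across jobs. Below is a minimal sketch of one such self-contained streaming application; the hostname, port, and object name are hypothetical, and `socketTextStream` stands in for a real connector such as the `spark-streaming-flume` one.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// One application == one driver JVM == one StreamingContext.
// A second, independent job (e.g. a Twitter or Kafka ingester) would be
// its own application with its own main() and its own context.
object FlumeIngestJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("FlumeIngest")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // Stand-in source; with the Flume connector this would be something like
    // FlumeUtils.createStream(ssc, "flume-host", 4141) instead.
    val events = ssc.socketTextStream("flume-host", 4141)

    events.count().print()  // simple per-batch action

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Each such application is then launched independently, e.g. `spark-submit --class FlumeIngestJob ingest.jar`, so many users can run many streaming jobs on the same cluster, capacity permitting.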

Re: Spark StreamingContext Question

2018-03-07 Thread sagar grover
Hi,

You can have multiple streams under the same streaming context and process them accordingly.

With regards,
Sagar Grover
Phone - 7022175584

On Wed, Mar 7, 2018 at 9:26 AM, ☼ R Nair (रविशंकर नायर) <ravishankar.n...@gmail.com> wrote:
> Hi all,
>
> Understand from documentation that, only one
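As a sketch of the "multiple streams under the same context" approach: a single StreamingContext can create several input DStreams, each consumed independently or combined with `union`. Hostnames and ports below are hypothetical placeholders.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object MultiStreamJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MultiStream")
    val ssc  = new StreamingContext(conf, Seconds(5))

    // Two independent input streams under the one context.
    val streamA = ssc.socketTextStream("source-a", 9999)
    val streamB = ssc.socketTextStream("source-b", 9998)

    // Process them separately...
    streamA.filter(_.contains("ERROR")).print()

    // ...or combine them into one logical stream.
    streamA.union(streamB)
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Note that each receiver-based input stream occupies one executor core, so the cluster needs enough cores for all receivers plus the processing itself.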

Spark StreamingContext Question

2018-03-06 Thread रविशंकर नायर
Hi all,

I understand from the documentation that only one streaming context can be active in a JVM at the same time. Hence, in an enterprise cluster, how can we manage/handle multiple users having many different streaming applications, where one may be ingesting data from Flume and another from Twitter