Got it, thanks....

On Wed, Mar 7, 2018 at 4:32 AM, Gerard Maas <gerard.m...@gmail.com> wrote:

> Hi,
>
> You can run as many jobs in your cluster as you want, provided you have
> enough capacity.
> The one-streaming-context constraint applies per job, not per cluster.
>
> You can submit several jobs for Flume and others for Twitter, Kafka,
> etc.
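>
> Each spark-submit launches its own driver JVM with its own
> StreamingContext, so the one-context rule never bites across jobs. A rough
> sketch (the socket source is a stand-in for Flume/Twitter/Kafka; the app
> name and batch interval are illustrative):
>
>   import org.apache.spark.SparkConf
>   import org.apache.spark.streaming.{Seconds, StreamingContext}
>
>   // Master comes from spark-submit; each submitted app is its own JVM.
>   val conf = new SparkConf().setAppName("flume-ingest")
>   // One StreamingContext per application/JVM.
>   val ssc = new StreamingContext(conf, Seconds(10))
>
>   // Stand-in source; a real Flume job would use the Flume input DStream.
>   val lines = ssc.socketTextStream("localhost", 9999)
>   lines.print()  // an output operation is required before start()
>
>   ssc.start()
>   ssc.awaitTermination()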
>
> If you are getting started with streaming in Spark, I'd recommend you
> look into Structured Streaming first.
> In Structured Streaming, you can have many streaming queries running under
> the same Spark session.
> Yet, that does not mean you need to put them all in the same job. You can
> (and should) still submit different jobs for different application concerns.
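>
> A minimal sketch of that pattern (using the built-in rate source and
> console sink so it runs without external systems; names are illustrative):
>
>   import org.apache.spark.sql.SparkSession
>
>   val spark = SparkSession.builder
>     .appName("multi-query-demo")
>     .getOrCreate()
>
>   // Two independent streaming queries sharing one SparkSession.
>   val q1 = spark.readStream.format("rate").load()
>     .writeStream.format("console").start()
>   val q2 = spark.readStream.format("rate").load()
>     .writeStream.format("console").start()
>
>   // Block until any of the session's queries terminates.
>   spark.streams.awaitAnyTermination()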
>
> kind regards, Gerard.
>
>
>
> On Wed, Mar 7, 2018 at 4:56 AM, ☼ R Nair (रविशंकर नायर) <
> ravishankar.n...@gmail.com> wrote:
>
>> Hi all,
>>
>> I understand from the documentation that only one streaming context can
>> be active in a JVM at the same time.
>>
>> Hence, in an enterprise cluster, how can we manage multiple users running
>> many different streaming applications, where one may be ingesting data
>> from Flume, another from Twitter, etc.? Is this not possible now?
>>
>> Best,
>> Passion
>>
>
>
