Re: Advice on multiple streaming job

2018-05-08 Thread Peter Liu
Hi Dhaval, I'm using the YARN scheduler (without needing to specify the port in the submit). Not sure why the port issue arises here. Gerard seems to have a good point about managing the multiple topics within your application (to avoid the port issue) - Not sure if you're using Spark Streaming or

Re: Advice on multiple streaming job

2018-05-07 Thread Dhaval Modi
Hi Gerard, Our source is Kafka, and we are using the standard streaming API (DStreams). Our requirement is: we have hundreds of Kafka topics, and each topic sends different messages in a (complex) JSON format. Topics are structured per domain, so each topic is independent of the others. These JSON
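[Editor's sketch, not part of the original thread] A minimal example of the approach suggested elsewhere in the thread: consuming several Kafka topics through a single DStream in one application, so only one driver (and one UI port) is needed. It assumes the spark-streaming-kafka-0-10 integration; the broker address, group id, and topic names are placeholders.

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object MultiTopicStreaming {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("multi-topic-streaming")
    val ssc  = new StreamingContext(conf, Seconds(10))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "broker:9092",                // placeholder
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "multi-topic-group",          // placeholder
      "auto.offset.reset"  -> "latest"
    )

    // One direct stream over many topics; record.topic tells you which domain
    // a message belongs to, so per-topic JSON handling can stay in one job.
    val topics = Seq("domain_a", "domain_b", "domain_c")    // placeholders
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))

    stream.foreachRDD { rdd =>
      rdd.map(record => (record.topic, record.value)).countByKey().foreach(println)
    }

    ssc.start()
    ssc.awaitTermination()
  }
}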

Re: Advice on multiple streaming job

2018-05-07 Thread Gerard Maas
Dhaval, Which Streaming API are you using? In Structured Streaming, you are able to start several streaming queries within the same context. Kind regards, Gerard. On Sun, May 6, 2018 at 7:59 PM, Dhaval Modi wrote: > Hi Susan, > > Thanks for your response. > > Will try
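[Editor's sketch, not part of the original thread] A minimal illustration of what Gerard describes: two independent streaming queries started from one SparkSession, sharing one driver and therefore one UI port. The Kafka options, topic names, paths, and sink format are placeholder assumptions.

import org.apache.spark.sql.SparkSession

object MultipleQueries {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("multiple-streaming-queries").getOrCreate()

    // First query: one Kafka topic to one sink (names are placeholders).
    val ordersQuery = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "orders")
      .load()
      .writeStream
      .format("parquet")
      .option("path", "/data/orders")
      .option("checkpointLocation", "/checkpoints/orders")
      .start()

    // Second query: runs concurrently inside the same SparkSession.
    val eventsQuery = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()
      .writeStream
      .format("parquet")
      .option("path", "/data/events")
      .option("checkpointLocation", "/checkpoints/events")
      .start()

    // Block until any of the queries terminates.
    spark.streams.awaitAnyTermination()
  }
}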

Re: Advice on multiple streaming job

2018-05-06 Thread Dhaval Modi
Hi Susan, Thanks for your response. Will try the configuration as suggested. But I am still looking for an answer: does Spark support running multiple jobs on the same port? On Sun, May 6, 2018, 20:27 Susan X. Huynh wrote: > Hi Dhaval, > > Not sure if you have considered this:

Re: Advice on multiple streaming job

2018-05-06 Thread Dhaval Modi
Hi vincent, Thanks for your response. We are using YARN, and CNI may not be possible. Thanks & Regards, Dhaval On Sun, May 6, 2018, 13:02 vincent gromakowski <vincent.gromakow...@gmail.com> wrote: > Use a scheduler that abstracts the network away with a CNI for instance or > other mechanisms

Re: Advice on multiple streaming job

2018-05-06 Thread Susan X. Huynh
Hi Dhaval, Not sure if you have considered this: port 4040 sounds like the driver UI port. By default it will try up to 4056, but you can increase that number with "spark.port.maxRetries" (https://spark.apache.org/docs/latest/configuration.html). Try setting it to "32". This would help if the
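[Editor's sketch, not part of the original thread] The setting Susan mentions, applied in code; the same property can also be passed with --conf at submit time. The value 32 simply mirrors the suggestion above, not a recommendation from the thread.

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// spark.port.maxRetries defaults to 16, which is why the UI scan stops at 4056 (4040 + 16).
val conf = new SparkConf()
  .setAppName("streaming-job")
  .set("spark.port.maxRetries", "32")

val spark = SparkSession.builder.config(conf).getOrCreate()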

Re: Advice on multiple streaming job

2018-05-06 Thread vincent gromakowski
Use a scheduler that abstracts the network away, with a CNI for instance or other mechanisms (Mesos, Kubernetes, YARN). The CNI will allow you to always bind to the same ports because each container will have its own IP. Some other solutions like Mesos and Marathon can work without CNI, with host IP

Advice on multiple streaming job

2018-05-05 Thread Dhaval Modi
Hi All, Need advice on executing multiple streaming jobs. Problem: We have hundreds of streaming jobs. Every streaming job uses a new port. Also, Spark automatically checks ports from 4040 to 4056; beyond that it fails. One of the workarounds is to provide the port explicitly. Is there a way to tackle this
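[Editor's sketch, not part of the original thread] The explicit-port workaround mentioned above, assuming each job is assigned its own UI port by some external bookkeeping; the port value and app name here are placeholders.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Pin this job's driver UI to a specific port so it does not collide in the 4040-4056 scan.
val conf = new SparkConf()
  .setAppName("streaming-job-01")
  .set("spark.ui.port", "4141")   // placeholder; must be unique per driver on the same host

val ssc = new StreamingContext(conf, Seconds(10))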