Hi Susan,

Thanks for your response.

I will try the configuration as suggested.

But I am still looking for an answer: does Spark support running multiple
jobs on the same port?

On Sun, May 6, 2018, 20:27 Susan X. Huynh <xhu...@mesosphere.io> wrote:

> Hi Dhaval,
>
> Not sure if you have considered this: the port 4040 sounds like a driver
> UI port. By default it will try up to 4056, but you can increase that
> number with "spark.port.maxRetries". (
> https://spark.apache.org/docs/latest/configuration.html) Try setting it
> to "32". This would help if the only conflict is among the driver UI ports
> (like if you have > 16 drivers running on the same host).
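>
> For example, a minimal sketch of a launch command, assuming spark-submit
> (the application jar name is a placeholder):
>
>     # allow the driver UI to probe ports 4040 through 4072 before giving up
>     spark-submit \
>       --conf spark.port.maxRetries=32 \
>       my_streaming_app.jar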
>
> Susan
>
> On Sun, May 6, 2018 at 12:32 AM, vincent gromakowski <
> vincent.gromakow...@gmail.com> wrote:
>
>> Use a scheduler that abstracts the network away, with a CNI for instance
>> or other mechanisms (Mesos, Kubernetes, YARN). The CNI will let you
>> always bind to the same ports, because each container has its own IP.
>> Other solutions, like Mesos with Marathon, can work without a CNI, using
>> host IP binding, but will manage the ports for you, ensuring there isn't
>> any conflict.
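>>
>> As a sketch of the host-IP-binding approach, assuming Marathon on Mesos
>> (the app id, resource sizes, and jar name are placeholders): requesting
>> port 0 asks Marathon to pick a free host port, which the task receives
>> in the $PORT0 environment variable.
>>
>>     {
>>       "id": "/streaming/job-1",
>>       "cmd": "spark-submit --conf spark.ui.port=$PORT0 my_streaming_app.jar",
>>       "cpus": 1.0,
>>       "mem": 2048,
>>       "portDefinitions": [{ "port": 0 }]
>>     }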
>>
>> On Sat, May 5, 2018 at 17:10, Dhaval Modi <dhavalmod...@gmail.com>
>> wrote:
>>
>>> Hi All,
>>>
>>> Need advice on executing multiple streaming jobs.
>>>
>>> Problem: We have hundreds of streaming jobs. Every streaming job uses a
>>> new port, and Spark automatically probes ports from 4040 to 4056; after
>>> that it fails. One workaround is to provide the port explicitly.
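>>>
>>> For example, the explicit-port workaround looks like this (a sketch;
>>> the port numbers and jar names are arbitrary):
>>>
>>>     spark-submit --conf spark.ui.port=4101 job1.jar
>>>     spark-submit --conf spark.ui.port=4102 job2.jar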
>>>
>>> Is there a way to tackle this situation, or am I missing anything?
>>>
>>> Thanking you in advance.
>>>
>>> Regards,
>>> Dhaval Modi
>>> dhavalmod...@gmail.com
>>>
>>
>
>
> --
> Susan X. Huynh
> Software engineer, Data Agility
> xhu...@mesosphere.com
>
