So I'm using code like this to set specific ports:

import org.apache.spark.{SparkConf, SparkContext}

// Pin the ports Spark's internal services listen on, instead of letting
// them be chosen at random (master is the master URL, defined elsewhere)
val conf = new SparkConf()
    .setMaster(master)
    .setAppName("namexxx")
    .set("spark.driver.port", "51810")
    .set("spark.fileserver.port", "51811")
    .set("spark.broadcast.port", "51812")
    .set("spark.replClassServer.port", "51813")
    .set("spark.blockManager.port", "51814")
    .set("spark.executor.port", "51815")

val sc = new SparkContext(conf)

My question now is: will the master forward the spark.executor.port
value to the worker when it hands it a task to run?

Also, the spark.executor.port property is different from the Worker's
own Spark port. How can I make the Worker run on a specific port?
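
From the standalone docs it looks like the Worker's listening port isn't set
through SparkConf at all but through conf/spark-env.sh on each worker machine;
would something like this be the right way to pin it? (A sketch, assuming the
documented SPARK_WORKER_PORT variable; the port number is just a placeholder.)

# conf/spark-env.sh on each worker machine
export SPARK_WORKER_PORT=51820        # port the standalone Worker daemon listens on (placeholder)
export SPARK_WORKER_WEBUI_PORT=8081   # worker web UI port (8081 is the default)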

Regards

jk


On Wed, May 13, 2015 at 7:51 PM, James King <jakwebin...@gmail.com> wrote:

> Indeed, many thanks.
>
>
> On Wednesday, 13 May 2015, Cody Koeninger <c...@koeninger.org> wrote:
>
>> I believe most ports are configurable at this point; look at
>>
>> http://spark.apache.org/docs/latest/configuration.html
>>
>> search for ".port"
>>
>> On Wed, May 13, 2015 at 9:38 AM, James King <jakwebin...@gmail.com>
>> wrote:
>>
>>> I understand that this port value is randomly selected.
>>>
>>> Is there a way to enforce which spark port a Worker should use?
>>>
>>
>>
