There may be firewall rules limiting which ports are open between the host
running Spark and the Hadoop cluster. In that case, not all ports are allowed.

Can a range of ports be specified instead?
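
For reference, the closest thing I can find to a range is
spark.port.maxRetries: if I read the configuration docs right, when a port
property is set to a non-zero value, each retry increments the previous
attempt's port by 1, so a fixed base port plus maxRetries effectively bounds
the range. A rough sketch of what I mean (yarn-client mode; the base port
values here are made up, and I am not sure any of these properties cover the
random AM port quoted below):

  spark-shell --master yarn-client \
    --conf spark.driver.port=40000 \
    --conf spark.blockManager.port=40100 \
    --conf spark.port.maxRetries=16
  # each service would bind somewhere in [base, base + spark.port.maxRetries]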

On Wed, Mar 25, 2015 at 4:06 PM, Shixiong Zhu <zsxw...@gmail.com> wrote:

> It's a random port to avoid port conflicts, since multiple AMs can run on
> the same machine. Why do you need a fixed port?
>
> Best Regards,
> Shixiong Zhu
>
> 2015-03-26 6:49 GMT+08:00 Manoj Samel <manojsamelt...@gmail.com>:
>
>> Spark 1.3, Hadoop 2.5, Kerberos
>>
>> When running spark-shell in yarn-client mode, it shows the following message
>> with a random port every time (44071 in the example below). Is there a way
>> to bind this to a specific port? It does not seem to be among the
>> spark.xxx.port properties listed in
>> http://spark.apache.org/docs/latest/configuration.html
>>
>> Thanks,
>>
>> 15/03/25 22:27:10 INFO Client: Application report for
>> application_1427316153428_0014 (state: ACCEPTED)
>> 15/03/25 22:27:10 INFO YarnClientSchedulerBackend: ApplicationMaster
>> registered as Actor[akka.tcp://sparkYarnAM@xyz
>> :44071/user/YarnAM#-1989273896]
>>
>
>
