Hi,

I am running spark-shell and connecting to a YARN cluster with deploy mode
"client". In our environment, there are security policies that don't allow
us to open all TCP ports.
The issue I am facing is that the Spark shell driver uses a random port for
its BlockManagerId - BlockManagerId(<driver>, host-name, 52131).

Is there any configuration property I can use to pin this port instead of
having it chosen at random?
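
For reference, the invocation below is the kind of thing I was hoping would
work; spark.blockManager.port is only my guess at the relevant property, and
I have not confirmed that it also controls the driver-side BlockManager port:

  # pin the driver RPC port, and (hopefully) the BlockManager port as well
  spark-shell --master yarn-client \
    --conf spark.driver.port=51000 \
    --conf spark.blockManager.port=51001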

I am running Spark 1.2.0 on CDH 5.3.0.

Thanks,
Manish



