Has anyone else run into this issue with spark-shell (yarn-client mode) in an
environment with strict firewall rules, where only a fixed set of incoming
ports is allowed? How can it be rectified?
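
In case it helps anyone searching the archives: the Spark 1.2 configuration
docs list several port properties (spark.driver.port, spark.fileserver.port,
spark.broadcast.port, spark.replClassServer.port, spark.blockManager.port),
so the invocation I am experimenting with looks roughly like the sketch
below. The port numbers are just placeholders for ports our firewall allows,
and I cannot yet confirm this pins every port the driver opens.

  spark-shell --master yarn-client \
    --conf spark.driver.port=40000 \
    --conf spark.fileserver.port=40001 \
    --conf spark.broadcast.port=40002 \
    --conf spark.replClassServer.port=40003 \
    --conf spark.blockManager.port=40004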

Thanks,
Manish

From: Manish Gupta 8
Sent: Thursday, March 26, 2015 4:09 PM
To: user@spark.apache.org
Subject: Port configuration for BlockManagerId

Hi,

I am running spark-shell against a YARN cluster with deploy mode "client".
In our environment, security policies do not allow us to open all TCP ports.
The issue I am facing is that the Spark Shell driver uses a random port for
its BlockManagerId, e.g. BlockManagerId(<driver>, host-name, 52131).

Is there any configuration setting I can use to pin this port to a fixed value instead of having it chosen at random?

I am running Spark 1.2.0 on CDH 5.3.0.
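
From the configuration page for this version, spark.blockManager.port looks
like it might control this, though I have not verified it on the driver
side. If it does, I would expect something like the following entry in
conf/spark-defaults.conf to work (52000 is just a placeholder for a port our
firewall allows):

  spark.blockManager.port  52000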

Thanks,
Manish



