Hi,

For security reasons, we added a server between my AWS Spark cluster and my
local machine, so I can no longer connect to the cluster directly. To see the
Spark UI and the stdout and stderr of the related workers, I set up SSH
dynamic forwarding and configured a SOCKS proxy in the browser. I can now
reach the Spark UI through the internal EC2 IP, but when I click through to
the application UI (port 4040) or a worker's UI (port 8081), the links still
use the public DNS instead of the internal EC2 IP, which the browser can no
longer resolve.
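
For reference, this is roughly how I opened the tunnel; the key file, user,
jump-server hostname, and local port below are just placeholders for my setup:

    # open a dynamic (SOCKS) tunnel through the intermediate server;
    # the browser is then pointed at the SOCKS proxy on localhost:8157
    ssh -N -D 8157 -i my-key.pem ec2-user@jump-server.example.com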

Is there a way to configure this? I saw that one can set SPARK_LOCAL_IP in
spark-env.sh, but I'm not sure whether that would help. Has anyone run into
the same issue?
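
For example, this is the kind of setting I was considering in
conf/spark-env.sh on the cluster nodes; the IP below is just a placeholder
for the internal EC2 address:

    # bind Spark to the internal EC2 IP (placeholder value)
    export SPARK_LOCAL_IP=172.31.0.10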

Thanks a lot!
Anny




