Did you try setting the SPARK_MASTER_IP parameter in spark-env.sh?
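For reference, a minimal spark-env.sh sketch (the addresses below are placeholders; on Spark 1.x the master bind address is SPARK_MASTER_IP, and SPARK_PUBLIC_DNS controls the hostname the web UI advertises in its links, which is what matters for the proxy setup discussed here):

```shell
# conf/spark-env.sh -- sourced by the Spark daemons at startup

# Bind the master to the internal EC2 address (placeholder value)
export SPARK_MASTER_IP=10.0.0.1

# Hostname the web UI puts into its links; pointing this at the
# internal name keeps UI links resolvable through a SOCKS proxy
export SPARK_PUBLIC_DNS=ip-10-0-0-1.ec2.internal
```

Restart the daemons (sbin/stop-all.sh, then sbin/start-all.sh) after editing so the change takes effect.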


On 31.3.2015. 19:19, Anny Chen wrote:
Hi Akhil,

I tried editing /etc/hosts on the master and on the workers, but it doesn't seem to be working for me.

I tried adding <hostname> <internal-ip> and it didn't work. I then tried <internal-ip> <hostname> and that didn't work either. I guess I should also edit the spark-env.sh file?

Thanks!
Anny

On Mon, Mar 30, 2015 at 11:15 PM, Akhil Das <ak...@sigmoidanalytics.com> wrote:

    You can add an internal-IP-to-public-hostname mapping in your
    /etc/hosts file; if your forwarding is set up correctly, it
    shouldn't be a problem thereafter.
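A sketch of what such an /etc/hosts entry looks like (the address and hostname below are placeholders; note the IP address comes first, then the name that should resolve to it):

```shell
# /etc/hosts -- resolve the cluster's public hostname to its internal address
10.0.0.1    ec2-54-0-0-1.compute-1.amazonaws.com
```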



    Thanks
    Best Regards

    On Tue, Mar 31, 2015 at 9:18 AM, anny9699 <anny9...@gmail.com> wrote:

        Hi,

        For security reasons, we added a server between my AWS Spark
        cluster and my local machine, so I can't connect to the cluster
        directly. To see the Spark UI and the workers' stdout and
        stderr, I used dynamic forwarding and configured a SOCKS proxy.
        Now I can see the Spark UI using the internal EC2 IP; however,
        when I click through to the application UI (port 4040) or a
        worker's UI (port 8081), the links still use the public DNS
        instead of the internal EC2 IP, which the browser can't
        resolve.
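The dynamic-forwarding setup described above can be sketched as follows (the host name and key path are placeholders; -D opens a local SOCKS proxy and -N tells ssh not to run a remote command):

```shell
# Open a SOCKS proxy on localhost:1080, tunneled through the
# intermediate server that fronts the cluster
ssh -i mykey.pem -N -D 1080 user@bastion.example.com

# Then point the browser's SOCKS proxy setting at localhost:1080 and
# browse to http://<internal-ec2-ip>:8080 for the master UI
```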

        Is there a way to configure this? I saw that one could set
        LOCAL_ADDRESS_IP in spark-env.sh, but I'm not sure whether that
        would help. Has anyone run into the same issue?

        Thanks a lot!
        Anny




        --
        View this message in context:
        http://apache-spark-user-list.1001560.n3.nabble.com/How-to-configure-SparkUI-to-use-internal-ec2-ip-tp22311.html
        Sent from the Apache Spark User List mailing list archive at
        Nabble.com.

        ---------------------------------------------------------------------
        To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
        For additional commands, e-mail: user-h...@spark.apache.org



