Re: How to configure SparkUI to use internal ec2 ip

2015-03-31 Thread Akhil Das
You can add an internal-IP-to-public-hostname mapping in your /etc/hosts
file. If your forwarding is set up properly, it shouldn't be a problem
after that.
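
For example (a sketch only; the IP and hostname below are placeholders, not values from this thread), the /etc/hosts entry format is the internal IP first, then the public hostname:

```shell
# Append a mapping in /etc/hosts format (normally done as root in the real
# /etc/hosts; a demo file is used here just to show the format).
echo "10.0.1.23  ec2-54-0-0-1.compute-1.amazonaws.com" >> hosts.demo
grep -c '10.0.1.23' hosts.demo   # count of matching entries
```

After that, the browser (going through the SOCKS proxy) resolves the public hostname to the internal address.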



Thanks
Best Regards

On Tue, Mar 31, 2015 at 9:18 AM, anny9699 anny9...@gmail.com wrote:

 Hi,

 For security reasons, we added a server between my AWS Spark cluster and
 my local machine, so I can't connect to the cluster directly. To see the
 Spark UI and the workers' stdout and stderr, I used dynamic forwarding and
 configured a SOCKS proxy. Now I can see the Spark UI via the internal EC2
 IP; however, when I click through to the application UI (port 4040) or a
 worker's UI (port 8081), the links still use the public DNS instead of the
 internal EC2 IP, which the browser can't reach.

 Is there a way to configure this? I saw that one could configure
 LOCAL_ADDRESS_IP in spark-env.sh, but I'm not sure whether that would
 help. Has anyone run into the same issue?

 Thanks a lot!
 Anny








Re: How to configure SparkUI to use internal ec2 ip

2015-03-31 Thread Anny Chen
Hi Akhil,

I tried editing the /etc/hosts on the master and on the workers, but it
doesn't seem to be working for me.

I tried adding "hostname internal-ip" and it didn't work. I then tried
adding "internal-ip hostname" and it didn't work either. I guess I should
also edit the spark-env.sh file?

Thanks!
Anny

On Mon, Mar 30, 2015 at 11:15 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:

 You can add an internal ip to public hostname mapping in your /etc/hosts
 file, if your forwarding is proper then it wouldn't be a problem there
 after.



 Thanks
 Best Regards




Re: How to configure SparkUI to use internal ec2 ip

2015-03-31 Thread Petar Zecevic


Did you try setting the SPARK_MASTER_IP parameter in spark-env.sh?
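
A sketch of what that would look like in conf/spark-env.sh on the master (the IP below is a placeholder, not a value from this thread):

```shell
# Demo file standing in for conf/spark-env.sh; in practice you would edit the
# real file on the master and restart the cluster.
cat >> master-env.demo <<'EOF'
export SPARK_MASTER_IP=10.0.1.23   # the master's internal EC2 address
EOF
grep -c SPARK_MASTER_IP master-env.demo   # count of matching lines
```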


On 31.3.2015. 19:19, Anny Chen wrote:

Hi Akhil,

I tried editing the /etc/hosts on the master and on the workers, and seems
it is not working for me.

I tried adding hostname internal-ip and it didn't work. I then tried
adding internal-ip hostname and it didn't work either. I guess I should
also edit the spark-env.sh file?


Thanks!
Anny






Re: How to configure SparkUI to use internal ec2 ip

2015-03-31 Thread Akhil Das
When you say you added "internal-ip hostname", were you able to ping any
of those hostnames from the machine?

You could try setting SPARK_LOCAL_IP on all machines, but make sure each
process will actually be able to bind to the host/IP specified there.
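
A sketch of that setting (the IP and hostname are placeholders): first check the address answers from the machine, then set SPARK_LOCAL_IP in conf/spark-env.sh on every node.

```shell
# First verify the address is reachable/resolvable from this machine, e.g.:
#   ping -c 1 10.0.1.23
# Then, in conf/spark-env.sh on each node (demo file used here for the format):
cat >> spark-env.sh.demo <<'EOF'
export SPARK_LOCAL_IP=10.0.1.23   # address Spark should bind to on this host
EOF
grep -c SPARK_LOCAL_IP spark-env.sh.demo   # count of matching lines
```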


Thanks
Best Regards

On Tue, Mar 31, 2015 at 10:49 PM, Anny Chen anny9...@gmail.com wrote:

 Hi Akhil,

 I tried editing the /etc/hosts on the master and on the workers, and seems
 it is not working for me.

 I tried adding hostname internal-ip and it didn't work. I then tried
 adding internal-ip hostname and it didn't work either. I guess I should
 also edit the spark-env.sh file?

 Thanks!
 Anny







Re: How to configure SparkUI to use internal ec2 ip

2015-03-31 Thread Anny Chen
Thanks Petar and Akhil for the suggestions.

I changed SPARK_MASTER_IP to the internal IP, deleted the
"export SPARK_PUBLIC_DNS=xx" line in spark-env.sh, and also edited
/etc/hosts as Akhil suggested, and now it is working! However, I don't
know which change actually made it work.
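
For reference, the two spark-env.sh edits described above can be reconstructed like this on a throwaway copy (the IP and DNS values are placeholders, not the real ones from the thread):

```shell
# Start from a demo copy of the relevant spark-env.sh lines.
printf 'export SPARK_MASTER_IP=1.2.3.4\nexport SPARK_PUBLIC_DNS=ec2-xx.compute.amazonaws.com\n' > env.demo
# Drop the public-DNS export entirely.
sed -i '/SPARK_PUBLIC_DNS/d' env.demo
# Point the master at the internal EC2 address instead.
sed -i 's/^export SPARK_MASTER_IP=.*/export SPARK_MASTER_IP=10.0.1.23/' env.demo
cat env.demo   # only the SPARK_MASTER_IP line remains, set to the internal IP
```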

Thanks!
Anny

On Tue, Mar 31, 2015 at 10:26 AM, Petar Zecevic petar.zece...@gmail.com
wrote:


 Did you try setting the SPARK_MASTER_IP parameter in spark-env.sh?










How to configure SparkUI to use internal ec2 ip

2015-03-30 Thread anny9699
Hi,

For security reasons, we added a server between my AWS Spark cluster and my
local machine, so I can't connect to the cluster directly. To see the Spark
UI and the workers' stdout and stderr, I used dynamic forwarding and
configured a SOCKS proxy. Now I can see the Spark UI via the internal EC2
IP; however, when I click through to the application UI (port 4040) or a
worker's UI (port 8081), the links still use the public DNS instead of the
internal EC2 IP, which the browser can't reach.

Is there a way to configure this? I saw that one could configure
LOCAL_ADDRESS_IP in spark-env.sh, but I'm not sure whether that would help.
Has anyone run into the same issue?
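
For context, a dynamic-forwarding setup like the one described usually looks something like this (the user, bastion host name, and SOCKS port are placeholders):

```shell
# Open a SOCKS proxy on localhost:1080, tunnelled through the intermediate
# server; -N means no remote command, just forwarding.
ssh -N -D 1080 user@bastion.example.com
# Then point the browser's SOCKS proxy at localhost:1080 and browse to the
# internal addresses, e.g. http://<internal-ec2-ip>:8080 for the master UI,
# :4040 for the application UI, :8081 for a worker UI.
```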

Thanks a lot!
Anny




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-configure-SparkUI-to-use-internal-ec2-ip-tp22311.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org