It sounds like when you start up Spark, it's using 0.0.0.0, which means it will
listen on all interfaces.
You should be able to limit which interface it uses.
The weird thing is that if you are specifying the IP address and port, Spark
shouldn't be listening on all of the interfaces.
Thanks much, Akhil. iptables is certainly a band-aid, but from an OpSec
perspective, it's troubling.
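For reference, the iptables band-aid in question might look roughly like the following sketch; the port (8080, the standalone WebUI default) and rule ordering are assumptions, so adjust them for the actual deployment:

```shell
# Sketch: allow WebUI traffic only on loopback, drop it everywhere else.
# Assumes the WebUI is on its default port 8080; change --dport as needed.
iptables -A INPUT -i lo -p tcp --dport 8080 -j ACCEPT
iptables -A INPUT -p tcp --dport 8080 -j DROP
```

This blocks remote access at the packet level, but the Jetty server underneath is still bound to all interfaces, which is the OpSec concern above.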
Is there any way to limit which interfaces the WebUI listens on? Is there a
Jetty configuration that I'm missing?
Thanks again for your help,
David
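One avenue worth checking (an assumption on my part, not something confirmed in this thread) is the SPARK_LOCAL_IP setting in conf/spark-env.sh, which the standalone-mode docs describe as the address Spark binds its services to. A minimal sketch, assuming a single-machine standalone deployment:

```shell
# conf/spark-env.sh -- sketch, assuming a standalone deployment where
# everything should stay on loopback.
# SPARK_LOCAL_IP is the address Spark attempts to bind its services to;
# 127.0.0.1 restricts them to the loopback interface.
export SPARK_LOCAL_IP=127.0.0.1
```

Whether this covers the WebUI's Jetty listener as well may depend on the Spark version in use.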
On Wed, Mar 30, 2016 at 2:25 AM, Akhil Das
In your case, you will be able to see the webui (unless restricted with
iptables) but you won't be able to submit jobs to that machine from a
remote machine since the spark master is spark://127.0.0.1:7077
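In other words, only a submission made from the master machine itself can reach that master URL; a sketch of such a local submission (the class name and jar are placeholders):

```shell
# Works only when run on the master machine itself, since the master
# is bound to loopback (spark://127.0.0.1:7077). From any remote host
# this address is unreachable.
spark-submit \
  --master spark://127.0.0.1:7077 \
  --class org.example.MyApp \
  myapp.jar
```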
Thanks
Best Regards
On Tue, Mar 29, 2016 at 8:12 PM, David O'Gwynn
/etc/hosts
127.0.0.1 localhost
conf/slaves
127.0.0.1
On Mon, Mar 28, 2016 at 5:36 PM, Mich Talebzadeh
wrote:
> in your /etc/hosts what do you have for localhost
>
> 127.0.0.1 localhost.localdomain localhost
>
> conf/slave should have one entry in your case
>
> cat
in your /etc/hosts what do you have for localhost
127.0.0.1 localhost.localdomain localhost
conf/slave should have one entry in your case
cat slaves
# A Spark Worker will be started on each of the machines listed below.
localhost
...
Dr Mich Talebzadeh
Greetings to all,
I've searched around the mailing list, but it would seem that (nearly?)
everyone has the opposite problem from mine. I made a stab at looking in the
source for an answer, but I figured I might as well see if anyone else has
run into the same problem as I have.
I'm trying to limit my