[ https://issues.apache.org/jira/browse/SPARK-4563?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15861453#comment-15861453 ]

Danny Robinson commented on SPARK-4563:
---------------------------------------

I found one completely hacked-up way to get my Spark driver in Docker working 
against a non-Docker Spark cluster. This is on Spark 1.6.2.

export SPARK_PUBLIC_DNS=IPADDR_OF_DOCKER_HOST_OR_PROXY
export SPARK_LOCAL_IP=IPADDR_OF_DOCKER_HOST_OR_PROXY

At container startup I do this:
echo -e "`hostname -i` `hostname` ${HOSTNAME_OF_DOCKER_HOST_OR_PROXY}" >> /etc/hosts
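
Putting the pieces together, the container entrypoint ends up looking roughly 
like this (a sketch only; the IP/hostname values are placeholders for your own 
environment):

#!/bin/sh
# Placeholders: substitute the address/name of your Docker host or proxy.
DOCKER_HOST_IP=203.0.113.10
DOCKER_HOST_NAME=spark-edge.example.com

# Control the IP that the Spark UI & BlockManager advertise.
export SPARK_PUBLIC_DNS=${DOCKER_HOST_IP}
export SPARK_LOCAL_IP=${DOCKER_HOST_IP}

# Make the external hostname resolve to this container's own address.
echo -e "$(hostname -i) $(hostname) ${DOCKER_HOST_NAME}" >> /etc/hosts

# Hand off to the real command (e.g. spark-submit ...).
exec "$@"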

Essentially, the exports seem to control the IP that the Spark UI and 
BlockManager recognize. The hosts-file hack lets the Spark driver resolve the 
external hostname as if it were a local hostname, so it knows which network 
interface to listen on, and it then uses that hostname in the connection info 
it sends to the executors. When the executors connect back, they naturally 
resolve the hostname to the correct external IP.
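
You can sanity-check the split-horizon resolution from both sides (the 
addresses below are illustrative only):

# Inside the container: the external name now maps to the container's IP.
getent hosts spark-edge.example.com   # -> 172.17.0.5 (container address)

# From an executor host: the same name resolves via normal DNS.
getent hosts spark-edge.example.com   # -> 203.0.113.10 (docker host/proxy)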

The reason I say HOST or PROXY is that I run haproxy as a Docker load balancer 
in front of my swarm. That way I never have to worry about exactly which node 
is running the Spark driver; all traffic routes via haproxy.
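
For what it's worth, the haproxy side is just TCP passthrough. A minimal 
fragment might look like this (hypothetical haproxy.cfg; it assumes you pin 
the driver port, e.g. --conf spark.driver.port=35000, so there is a fixed 
port to forward):

frontend spark_driver
    bind *:35000
    mode tcp
    default_backend spark_driver_nodes

backend spark_driver_nodes
    mode tcp
    # Swarm nodes that may be running the Spark driver container.
    server node1 10.0.0.11:35000 check
    server node2 10.0.0.12:35000 check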

Agree with many here, though: this is crazy complicated and inconsistent.


> Allow spark driver to bind to a different ip than the advertised ip
> -------------------------------------------------------------------
>
>                 Key: SPARK-4563
>                 URL: https://issues.apache.org/jira/browse/SPARK-4563
>             Project: Spark
>          Issue Type: Improvement
>          Components: Deploy
>            Reporter: Long Nguyen
>            Assignee: Marcelo Vanzin
>            Priority: Minor
>             Fix For: 2.1.0
>
>
> The Spark driver's bind IP and advertised IP are not separately 
> configurable. spark.driver.host only sets the bind IP, and SPARK_PUBLIC_DNS 
> does not work for the Spark driver. Allow an option to set the advertised 
> IP/hostname.


