Ilya Ostrovskiy created SPARK-13960:
---------------------------------------
             Summary: HTTP-based JAR server doesn't respect spark.driver.host and there is no "spark.fileserver.host" option
                 Key: SPARK-13960
                 URL: https://issues.apache.org/jira/browse/SPARK-13960
             Project: Spark
          Issue Type: Bug
          Components: Spark Core, Spark Submit
    Affects Versions: 1.6.1
         Environment: Any system with more than one IP address
            Reporter: Ilya Ostrovskiy

There is no option to specify which hostname/IP address the jar/file server listens on. Rather than using "spark.driver.host" when it is specified, the jar/file server listens on the system's primary IP address (see the sketch after the steps below). This is an issue when submitting an application in client mode from a machine with two NICs connected to two different networks.

Steps to reproduce:

1) Have a cluster in a remote network, whose master is on 192.168.255.10.

2) Have a machine at another location, with a "primary" IP address of "192.168.1.2", connected to the "remote network" as well, with the IP address "192.168.255.250". Let's call this the "client machine".

3) Ensure every machine in the Spark cluster at the remote location can ping "192.168.255.250" and reach the client machine via that address.

4) On the client machine:

spark-submit --deploy-mode client --conf "spark.driver.host=192.168.255.250" --master spark://192.168.255.10:7077 --class <any valid spark application> <local jar with spark application> <whatever args you want>

5) Navigate to "http://192.168.255.250:4040/" and confirm that executors from the remote cluster have found the driver on the client machine.

6) Navigate to "http://192.168.255.250:4040/environment/" and scroll to the bottom.

7) Observe that the JAR specified in step 4 is listed under "http://192.168.1.2:<random port>/jars/<your jar here>.jar", i.e. the file server advertised the primary IP rather than "spark.driver.host".

8) Grok the source and documentation to see if there's any way to change that.

9) Submit this issue.
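To illustrate the mismatch, here is a minimal Scala sketch of what the behaviour looks like from the outside. This is my guess at the mechanism based on the observed URLs, not the actual Spark 1.6 file-server code: the server appears to advertise whatever the JVM resolves as the machine's primary address, ignoring "spark.driver.host" (which spark-submit exposes as a JVM system property on the driver).

import java.net.InetAddress

object BindAddressDemo {
  def main(args: Array[String]): Unit = {
    // The address the jar/file server appears to pick: whatever the JVM
    // resolves as this machine's primary address (192.168.1.2 above).
    val primary = InetAddress.getLocalHost.getHostAddress

    // The address the driver advertises to executors, taken from the
    // submitted conf (192.168.255.250 above).
    val driverHost = sys.props.getOrElse("spark.driver.host", primary)

    println(s"jar/file server advertises: $primary")
    println(s"driver advertises:          $driverHost")
    // When these differ (multi-NIC client), executors try to fetch jars
    // from an address they may not be able to route to.
  }
}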
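What I was hoping to find in step 8, sketched below: a "spark.fileserver.host" option that the file server would consult first, falling back to "spark.driver.host" and only then to the primary address. Note that "spark.fileserver.host" does not exist in 1.6.1; this issue is asking for it (or for the file server to honour "spark.driver.host").

import java.net.InetAddress
import org.apache.spark.SparkConf

object FileServerHostResolution {
  // Hypothetical resolution order for the file server's advertised host;
  // "spark.fileserver.host" is the option this issue requests.
  def fileServerHost(conf: SparkConf): String =
    conf.getOption("spark.fileserver.host")
      .orElse(conf.getOption("spark.driver.host"))
      .getOrElse(InetAddress.getLocalHost.getHostAddress)
}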