[ https://issues.apache.org/jira/browse/SPARK-9279?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15864595#comment-15864595 ]

Julian Gamble commented on SPARK-9279:
--------------------------------------

This is a serious issue in cloud/Docker environments when trying to view 
Spark workers through a proxy (such as Fabio) that defaults to port 80. The 
worker needs to publish its Web UI on port 80 (regardless of which port the 
Docker host exposes) so that the port published to the proxy is 80. This can 
be worked around with ambassador containers, but other applications do not 
force that extra step.
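
A rough sketch of that ambassador-style workaround, for illustration only 
(the network name, container names, and the worker UI port 8081 are 
assumptions, not details from this issue): run a small socat forwarder that 
listens on port 80 and relays to the worker's Web UI, then point the proxy 
at the forwarder.
{code}
# Hypothetical ambassador: publishes port 80 and forwards to the worker's
# Web UI (8081 is the standalone worker's default UI port).
docker run -d --name spark-worker-ui-ambassador \
  --network spark-net \
  -p 80:80 \
  alpine/socat \
  TCP-LISTEN:80,fork,reuseaddr TCP:spark-worker:8081
{code}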

> Spark Master Refuses to Bind WebUI to a Privileged Port
> -------------------------------------------------------
>
>                 Key: SPARK-9279
>                 URL: https://issues.apache.org/jira/browse/SPARK-9279
>             Project: Spark
>          Issue Type: Improvement
>          Components: Web UI
>    Affects Versions: 1.4.1
>         Environment: Ubuntu Trusty running in a docker container
>            Reporter: Omar Padron
>            Priority: Minor
>
> When trying to start a Spark master server as root...
> {code}
> export SPARK_MASTER_PORT=7077
> export SPARK_MASTER_WEBUI_PORT=80
> spark-class org.apache.spark.deploy.master.Master \
>     --host "$( hostname )" \
>     --port "$SPARK_MASTER_PORT" \
>     --webui-port "$SPARK_MASTER_WEBUI_PORT"
> {code}
> The process terminates with IllegalArgumentException "requirement failed: 
> startPort should be between 1024 and 65535 (inclusive), or 0 for a random 
> free port."
> But when SPARK_MASTER_WEBUI_PORT=8080 (or anything >= 1024), the process 
> runs fine.
> I do not understand why the usable ports have been arbitrarily restricted 
> to the non-privileged range. Users choosing to run Spark as root should be 
> allowed to choose their own ports.
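> A possible interim workaround (just a sketch, and it assumes iptables is 
> available inside the container) is to leave the Web UI on an unprivileged 
> port and redirect port 80 to it:
> {code}
> export SPARK_MASTER_WEBUI_PORT=8080
> # redirect external traffic arriving on port 80 to the Web UI port
> iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-ports 8080
> # redirect locally generated traffic (e.g. curl http://localhost/ in the container)
> iptables -t nat -A OUTPUT -o lo -p tcp --dport 80 -j REDIRECT --to-ports 8080
> {code}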
> Full output from a sample run below:
> {code}
> 2015-07-23 14:36:50,892 INFO  [main] master.Master 
> (SignalLogger.scala:register(47)) - Registered signal handlers for [TERM, 
> HUP, INT]
> 2015-07-23 14:36:51,399 WARN  [main] util.NativeCodeLoader 
> (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library 
> for your platform... using builtin-java classes where applicable
> 2015-07-23 14:36:51,586 INFO  [main] spark.SecurityManager 
> (Logging.scala:logInfo(59)) - Changing view acls to: root
> 2015-07-23 14:36:51,587 INFO  [main] spark.SecurityManager 
> (Logging.scala:logInfo(59)) - Changing modify acls to: root
> 2015-07-23 14:36:51,588 INFO  [main] spark.SecurityManager 
> (Logging.scala:logInfo(59)) - SecurityManager: authentication disabled; ui 
> acls disabled; users with view permissions: Set(root); users with modify 
> permissions: Set(root)
> 2015-07-23 14:36:52,295 INFO  [sparkMaster-akka.actor.default-dispatcher-2] 
> slf4j.Slf4jLogger (Slf4jLogger.scala:applyOrElse(80)) - Slf4jLogger started
> 2015-07-23 14:36:52,349 INFO  [sparkMaster-akka.actor.default-dispatcher-2] 
> Remoting (Slf4jLogger.scala:apply$mcV$sp(74)) - Starting remoting
> 2015-07-23 14:36:52,489 INFO  [sparkMaster-akka.actor.default-dispatcher-2] 
> Remoting (Slf4jLogger.scala:apply$mcV$sp(74)) - Remoting started; listening 
> on addresses :[akka.tcp://sparkMaster@sparkmaster:7077]
> 2015-07-23 14:36:52,497 INFO  [main] util.Utils (Logging.scala:logInfo(59)) - 
> Successfully started service 'sparkMaster' on port 7077.
> 2015-07-23 14:36:52,717 INFO  [sparkMaster-akka.actor.default-dispatcher-4] 
> server.Server (Server.java:doStart(272)) - jetty-8.y.z-SNAPSHOT
> 2015-07-23 14:36:52,759 INFO  [sparkMaster-akka.actor.default-dispatcher-4] 
> server.AbstractConnector (AbstractConnector.java:doStart(338)) - Started 
> SelectChannelConnector@sparkmaster:6066
> 2015-07-23 14:36:52,759 INFO  [sparkMaster-akka.actor.default-dispatcher-4] 
> util.Utils (Logging.scala:logInfo(59)) - Successfully started service on port 
> 6066.
> 2015-07-23 14:36:52,760 INFO  [sparkMaster-akka.actor.default-dispatcher-4] 
> rest.StandaloneRestServer (Logging.scala:logInfo(59)) - Started REST server 
> for submitting applications on port 6066
> 2015-07-23 14:36:52,765 INFO  [sparkMaster-akka.actor.default-dispatcher-4] 
> master.Master (Logging.scala:logInfo(59)) - Starting Spark master at 
> spark://sparkmaster:7077
> 2015-07-23 14:36:52,766 INFO  [sparkMaster-akka.actor.default-dispatcher-4] 
> master.Master (Logging.scala:logInfo(59)) - Running Spark version 1.4.1
> 2015-07-23 14:36:52,772 ERROR [sparkMaster-akka.actor.default-dispatcher-4] 
> ui.MasterWebUI (Logging.scala:logError(96)) - Failed to bind MasterWebUI
> java.lang.IllegalArgumentException: requirement failed: startPort should be 
> between 1024 and 65535 (inclusive), or 0 for a random free port.
>         at scala.Predef$.require(Predef.scala:233)
>         at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1977)
>         at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:238)
>         at org.apache.spark.ui.WebUI.bind(WebUI.scala:117)
>         at org.apache.spark.deploy.master.Master.preStart(Master.scala:144)
>         at akka.actor.Actor$class.aroundPreStart(Actor.scala:470)
>         at org.apache.spark.deploy.master.Master.aroundPreStart(Master.scala:52)
>         at akka.actor.ActorCell.create(ActorCell.scala:580)
>         at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:456)
>         at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
>         at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
>         at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>         at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>         at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>         at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>         at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>         at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> 2015-07-23 14:36:52,778 INFO  [Thread-1] util.Utils 
> (Logging.scala:logInfo(59)) - Shutdown hook called
> {code}


