[ https://issues.apache.org/jira/browse/SPARK-11909?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15021669#comment-15021669 ]

Saisai Shao commented on SPARK-11909:
-------------------------------------

The master prints the master URL in the web UI and in its log. Since the 
master is a daemon process, printing it to the console is not a good fit.

Also, as [~srowen] suggested, it is better for the user to explicitly specify 
the port number. The port is also what distinguishes whether you're submitting 
the Spark application through the binary protocol (7077) or the REST interface 
(6066); if it could be omitted, it would be hard for Spark itself to decide 
which port you actually want to submit to.
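
For illustration, a minimal Scala sketch of that ambiguity (this is not Spark's 
actual parsing code; {{MasterUrlSketch}}, {{parseMasterUrl}} and {{SubmitMode}} 
are hypothetical names). Once the port is parsed, it is what tells the client 
whether the legacy RPC endpoint (7077) or the REST endpoint (6066) is meant, so 
a URL without a port leaves nothing to decide on:

{code}
// Hypothetical sketch only, not Spark's implementation: shows why a master
// URL without a port is ambiguous for the submission client.
object MasterUrlSketch {
  sealed trait SubmitMode
  case object LegacyRpc extends SubmitMode  // spark://host:7077
  case object Rest      extends SubmitMode  // spark://host:6066

  // Returns (host, port, mode); rejects URLs without a port, mirroring the
  // current behaviour where spark://localhost is treated as invalid.
  def parseMasterUrl(url: String): (String, Int, SubmitMode) = {
    val SparkUrl = """spark://([^:/]+):(\d+)""".r
    url match {
      case SparkUrl(host, port) =>
        val p = port.toInt
        val mode = if (p == 6066) Rest else LegacyRpc
        (host, p, mode)
      case _ =>
        // Without an explicit port we would have to guess between 7077 and
        // 6066, which is exactly the ambiguity described above.
        throw new IllegalArgumentException(s"Invalid master URL: $url")
    }
  }

  def main(args: Array[String]): Unit = {
    println(parseMasterUrl("spark://localhost:7077")) // (localhost,7077,LegacyRpc)
    println(parseMasterUrl("spark://localhost:6066")) // (localhost,6066,Rest)
    // parseMasterUrl("spark://localhost")            // would throw, as in SPARK-11909
  }
}
{code}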

> Spark Standalone's master URL accepts URLs without port (assuming default 7077)
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-11909
>                 URL: https://issues.apache.org/jira/browse/SPARK-11909
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.6.0
>            Reporter: Jacek Laskowski
>            Priority: Trivial
>
> It's currently impossible to use a {{spark://localhost}} URL for Spark 
> Standalone's master. With this feature supported, there would be less to know 
> to get started with the mode (and hence improved user friendliness).
> I think a no-port master URL should be supported, assuming the default port 
> {{7077}}.
> {code}
> org.apache.spark.SparkException: Invalid master URL: spark://localhost
>       at org.apache.spark.util.Utils$.extractHostPortFromSparkUrl(Utils.scala:2088)
>       at org.apache.spark.rpc.RpcAddress$.fromSparkURL(RpcAddress.scala:47)
>       at org.apache.spark.deploy.client.AppClient$$anonfun$1.apply(AppClient.scala:48)
>       at org.apache.spark.deploy.client.AppClient$$anonfun$1.apply(AppClient.scala:48)
>       at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
>       at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
>       at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>       at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
>       at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
>       at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
>       at org.apache.spark.deploy.client.AppClient.<init>(AppClient.scala:48)
>       at org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend.start(SparkDeploySchedulerBackend.scala:93)
>       at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
> {code}


