[ https://issues.apache.org/jira/browse/SPARK-11909?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15021167#comment-15021167 ]

Sean Owen commented on SPARK-11909:
-----------------------------------

I think that cuts the other way. You'd be helping people not think about which 
port the master they're talking to is running on, which is probably more 
confusing than stating the port explicitly, especially if they accidentally 
talk to the wrong one somehow.

> Spark Standalone's master URL accepts URLs without port (assuming default 7077)
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-11909
>                 URL: https://issues.apache.org/jira/browse/SPARK-11909
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.6.0
>            Reporter: Jacek Laskowski
>            Priority: Trivial
>
> It's currently impossible to use a {{spark://localhost}} URL for Spark 
> Standalone's master. If this were supported, there would be less to know to 
> get started with the mode (and hence it would improve user friendliness).
> I think a no-port master URL should be accepted and assumed to use the default 
> port {{7077}}, as sketched after the stack trace below.
> {code}
> org.apache.spark.SparkException: Invalid master URL: spark://localhost
>       at org.apache.spark.util.Utils$.extractHostPortFromSparkUrl(Utils.scala:2088)
>       at org.apache.spark.rpc.RpcAddress$.fromSparkURL(RpcAddress.scala:47)
>       at org.apache.spark.deploy.client.AppClient$$anonfun$1.apply(AppClient.scala:48)
>       at org.apache.spark.deploy.client.AppClient$$anonfun$1.apply(AppClient.scala:48)
>       at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
>       at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
>       at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>       at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
>       at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
>       at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
>       at org.apache.spark.deploy.client.AppClient.<init>(AppClient.scala:48)
>       at org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend.start(SparkDeploySchedulerBackend.scala:93)
>       at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
> {code}
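> A minimal sketch of the proposed behaviour, assuming {{java.net.URI}} parsing. 
> This is not the actual {{Utils.extractHostPortFromSparkUrl}} implementation; 
> the object and method names below are made up for illustration.
> {code}
> // Hypothetical helper: extract host and port from a spark:// master URL,
> // falling back to port 7077 when the URL omits the port.
> object MasterUrl {
>   val DefaultPort = 7077
>
>   def hostAndPort(sparkUrl: String): (String, Int) = {
>     val uri = new java.net.URI(sparkUrl)
>     require(uri.getScheme == "spark" && uri.getHost != null,
>       s"Invalid master URL: $sparkUrl")
>     // URI.getPort returns -1 when no port is present; substitute the default.
>     val port = if (uri.getPort == -1) DefaultPort else uri.getPort
>     (uri.getHost, port)
>   }
>
>   def main(args: Array[String]): Unit = {
>     println(hostAndPort("spark://localhost"))      // (localhost,7077)
>     println(hostAndPort("spark://localhost:7077")) // (localhost,7077)
>   }
> }
> {code}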


