[ https://issues.apache.org/jira/browse/SPARK-11909?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15021153#comment-15021153 ]
Jacek Laskowski commented on SPARK-11909:
-----------------------------------------

_"The default is not a well-known port like 80 for HTTP"_ - that's exactly why I filed the issue. Since it's not well known, it's hard to remember, and hence not very easy for people new to Spark. I experienced the mental "pain" today when I started Spark Standalone and had to recall the number to create a SparkContext properly. Less to remember => less confusion => happier users.

> Spark Standalone's master URL accepts URLs without port (assuming default 7077)
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-11909
>                 URL: https://issues.apache.org/jira/browse/SPARK-11909
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.6.0
>            Reporter: Jacek Laskowski
>            Priority: Trivial
>
> It's currently impossible to use a {{spark://localhost}} URL for Spark
> Standalone's master. With this feature supported, there would be less to know to get
> started with the mode (and hence improved user friendliness).
> I think a no-port master URL should be supported and should assume the default port
> {{7077}}.
> {code}
> org.apache.spark.SparkException: Invalid master URL: spark://localhost
> 	at org.apache.spark.util.Utils$.extractHostPortFromSparkUrl(Utils.scala:2088)
> 	at org.apache.spark.rpc.RpcAddress$.fromSparkURL(RpcAddress.scala:47)
> 	at org.apache.spark.deploy.client.AppClient$$anonfun$1.apply(AppClient.scala:48)
> 	at org.apache.spark.deploy.client.AppClient$$anonfun$1.apply(AppClient.scala:48)
> 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
> 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
> 	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
> 	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
> 	at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
> 	at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
> 	at org.apache.spark.deploy.client.AppClient.<init>(AppClient.scala:48)
> 	at org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend.start(SparkDeploySchedulerBackend.scala:93)
> 	at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
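The requested behavior - accept {{spark://host}} and fall back to port 7077 - could be sketched roughly as below. This is a hypothetical Java illustration, not Spark's actual {{Utils.extractHostPortFromSparkUrl}}; the class name, regex, and constant are assumptions for the sake of the example.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SparkUrlParser {
    // Assumed default for Spark Standalone's master, per the issue description.
    static final int DEFAULT_MASTER_PORT = 7077;

    // Hypothetical pattern: host is mandatory, ":port" is optional.
    static final Pattern SPARK_URL = Pattern.compile("spark://([^:/]+)(?::(\\d+))?");

    /** Returns {host, port}, defaulting the port to 7077 when omitted. */
    static String[] extractHostPort(String url) {
        Matcher m = SPARK_URL.matcher(url);
        if (!m.matches()) {
            throw new IllegalArgumentException("Invalid master URL: " + url);
        }
        String host = m.group(1);
        String port = (m.group(2) != null)
                ? m.group(2)
                : String.valueOf(DEFAULT_MASTER_PORT);
        return new String[] { host, port };
    }

    public static void main(String[] args) {
        String[] hp = SparkUrlParser.extractHostPort("spark://localhost");
        System.out.println(hp[0] + ":" + hp[1]); // prints localhost:7077
    }
}
```

With a fallback like this, {{spark://localhost}} would resolve to {{localhost:7077}} instead of throwing the {{SparkException}} shown above, while an explicit port such as {{spark://localhost:7078}} would still be honored.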