[ https://issues.apache.org/jira/browse/SPARK-10368?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-10368:
------------------------------
      Priority: Minor  (was: Major)
    Issue Type: Improvement  (was: Bug)

I'm not sure it's a bug; the error is reported correctly. Failing fast for this 
and other types of errors is probably appropriate, though maybe not for all of 
them. That would give a better user experience, since you currently have to dig 
to see that this has nothing to do with sqlContext.
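
A rough sketch of what failing fast could look like in the REPL startup, 
assuming a validation step before the {{SparkContext}} is constructed; the 
helper and object names are hypothetical, and the accepted forms below only 
approximate what {{SparkContext.createTaskScheduler}} actually matches:

{code}
// Hypothetical helper for org.apache.spark.repl.Main: reject an unparseable
// master URL up front and exit, instead of leaving a half-initialized shell
// in which sqlContext was never created.
object MasterUrlCheck {
  def validateMasterUrl(master: String): Unit = {
    val looksValid = master == "local" ||
      master.startsWith("local[") ||
      master.startsWith("spark://") ||
      master.startsWith("mesos://") ||
      master.startsWith("yarn")
    if (!looksValid) {
      System.err.println(s"Could not parse Master URL: '$master'")
      sys.exit(1) // fail fast, before the banner and the sqlContext errors
    }
  }
}
{code}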

> "Could not parse Master URL" leaves spark-shell unusable
> --------------------------------------------------------
>
>                 Key: SPARK-10368
>                 URL: https://issues.apache.org/jira/browse/SPARK-10368
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Shell
>    Affects Versions: 1.5.0
>         Environment: Spark built from today's sources: 
> {{f0f563a3c43fc9683e6920890cce44611c0c5f4b}}
>            Reporter: Jacek Laskowski
>            Priority: Minor
>
> When executing {{spark-shell}} with an incorrect value for {{--master}}, the 
> exception is thrown (twice!), but the shell remains open and is almost 
> unusable. A minimal reproduction sketch follows the log below.
> {code}
> ➜  spark git:(master) ✗ ./bin/spark-shell --master mesos:localhost:8080
> log4j:WARN No appenders could be found for logger 
> (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more 
> info.
> Using Spark's default log4j profile: 
> org/apache/spark/log4j-defaults.properties
> 15/08/31 15:00:10 INFO SecurityManager: Changing view acls to: jacek
> 15/08/31 15:00:10 INFO SecurityManager: Changing modify acls to: jacek
> 15/08/31 15:00:10 INFO SecurityManager: SecurityManager: authentication 
> disabled; ui acls disabled; users with view permissions: Set(jacek); users 
> with modify permissions: Set(jacek)
> 15/08/31 15:00:11 INFO HttpServer: Starting HTTP Server
> 15/08/31 15:00:11 INFO Utils: Successfully started service 'HTTP server' on 
> port 56110.
> 15/08/31 15:00:14 INFO Main: Spark class server started at 
> http://192.168.99.1:56110
> 15/08/31 15:00:14 INFO SparkContext: Running Spark version 1.5.0-SNAPSHOT
> 15/08/31 15:00:14 INFO SecurityManager: Changing view acls to: jacek
> 15/08/31 15:00:14 INFO SecurityManager: Changing modify acls to: jacek
> 15/08/31 15:00:14 INFO SecurityManager: SecurityManager: authentication 
> disabled; ui acls disabled; users with view permissions: Set(jacek); users 
> with modify permissions: Set(jacek)
> 15/08/31 15:00:15 INFO Slf4jLogger: Slf4jLogger started
> 15/08/31 15:00:15 INFO Remoting: Starting remoting
> 15/08/31 15:00:15 INFO Remoting: Remoting started; listening on addresses 
> :[akka.tcp://sparkDriver@192.168.99.1:56112]
> 15/08/31 15:00:15 INFO Utils: Successfully started service 'sparkDriver' on 
> port 56112.
> 15/08/31 15:00:15 INFO SparkEnv: Registering MapOutputTracker
> 15/08/31 15:00:15 INFO SparkEnv: Registering BlockManagerMaster
> 15/08/31 15:00:15 INFO DiskBlockManager: Created local directory at 
> /private/var/folders/0w/kb0d3rqn4zb9fcc91pxhgn8w0000gn/T/blockmgr-291429cd-8ca8-4622-87e5-b4c7ee68afcd
> 15/08/31 15:00:15 INFO MemoryStore: MemoryStore started with capacity 530.0 MB
> 15/08/31 15:00:15 INFO HttpFileServer: HTTP File server directory is 
> /private/var/folders/0w/kb0d3rqn4zb9fcc91pxhgn8w0000gn/T/spark-803119c2-e940-4817-a166-836f16c9027d/httpd-585b0b63-0512-4ae9-8739-8032c4125c77
> 15/08/31 15:00:15 INFO HttpServer: Starting HTTP Server
> 15/08/31 15:00:15 INFO Utils: Successfully started service 'HTTP file server' 
> on port 56113.
> 15/08/31 15:00:15 INFO SparkEnv: Registering OutputCommitCoordinator
> 15/08/31 15:00:15 INFO Utils: Successfully started service 'SparkUI' on port 
> 4040.
> 15/08/31 15:00:15 INFO SparkUI: Started SparkUI at http://192.168.99.1:4040
> 15/08/31 15:00:15 ERROR SparkContext: Error initializing SparkContext.
> org.apache.spark.SparkException: Could not parse Master URL: 
> 'mesos:localhost:8080'
>       at 
> org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2693)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:506)
>       at org.apache.spark.repl.Main$.createSparkContext(Main.scala:78)
>       at $line3.$read$$iw$$iw.<init>(<console>:12)
>       at $line3.$read$$iw.<init>(<console>:21)
>       at $line3.$read.<init>(<console>:23)
>       at $line3.$read$.<init>(<console>:27)
>       at $line3.$read$.<clinit>(<console>)
>       at $line3.$eval$.$print$lzycompute(<console>:7)
>       at $line3.$eval$.$print(<console>:6)
>       at $line3.$eval.$print(<console>)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:497)
>       at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:784)
>       at 
> scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1039)
>       at 
> scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:636)
>       at 
> scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:635)
>       at 
> scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
>       at 
> scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
>       at 
> scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:635)
>       at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:567)
>       at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:563)
>       at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:802)
>       at 
> scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:836)
>       at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:694)
>       at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:404)
>       at 
> org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcZ$sp(SparkILoop.scala:39)
>       at 
> org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:38)
>       at 
> org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:38)
>       at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:213)
>       at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:38)
>       at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:94)
>       at 
> scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:922)
>       at 
> scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
>       at 
> scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
>       at 
> scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
>       at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:911)
>       at org.apache.spark.repl.Main$.main(Main.scala:48)
>       at org.apache.spark.repl.Main.main(Main.scala)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:497)
>       at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>       at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>       at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 15/08/31 15:00:15 INFO SparkUI: Stopped Spark web UI at 
> http://192.168.99.1:4040
> 15/08/31 15:00:15 INFO MapOutputTrackerMasterEndpoint: 
> MapOutputTrackerMasterEndpoint stopped!
> 15/08/31 15:00:15 ERROR Utils: Uncaught exception in thread main
> java.lang.NullPointerException
>       at 
> org.apache.spark.network.netty.NettyBlockTransferService.close(NettyBlockTransferService.scala:152)
>       at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1238)
>       at org.apache.spark.SparkEnv.stop(SparkEnv.scala:100)
>       at 
> org.apache.spark.SparkContext$$anonfun$stop$12.apply$mcV$sp(SparkContext.scala:1740)
>       at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1185)
>       at org.apache.spark.SparkContext.stop(SparkContext.scala:1739)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
>       at org.apache.spark.repl.Main$.createSparkContext(Main.scala:78)
>       at $line3.$read$$iw$$iw.<init>(<console>:12)
>       at $line3.$read$$iw.<init>(<console>:21)
>       at $line3.$read.<init>(<console>:23)
>       at $line3.$read$.<init>(<console>:27)
>       at $line3.$read$.<clinit>(<console>)
>       at $line3.$eval$.$print$lzycompute(<console>:7)
>       at $line3.$eval$.$print(<console>:6)
>       at $line3.$eval.$print(<console>)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:497)
>       at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:784)
>       at 
> scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1039)
>       at 
> scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:636)
>       at 
> scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:635)
>       at 
> scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
>       at 
> scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
>       at 
> scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:635)
>       at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:567)
>       at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:563)
>       at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:802)
>       at 
> scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:836)
>       at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:694)
>       at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:404)
>       at 
> org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcZ$sp(SparkILoop.scala:39)
>       at 
> org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:38)
>       at 
> org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:38)
>       at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:213)
>       at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:38)
>       at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:94)
>       at 
> scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:922)
>       at 
> scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
>       at 
> scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
>       at 
> scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
>       at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:911)
>       at org.apache.spark.repl.Main$.main(Main.scala:48)
>       at org.apache.spark.repl.Main.main(Main.scala)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:497)
>       at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>       at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>       at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 15/08/31 15:00:15 INFO SparkContext: Successfully stopped SparkContext
> org.apache.spark.SparkException: Could not parse Master URL: 
> 'mesos:localhost:8080'
>   at 
> org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2693)
>   at org.apache.spark.SparkContext.<init>(SparkContext.scala:506)
>   at org.apache.spark.repl.Main$.createSparkContext(Main.scala:78)
>   ... 47 elided
> java.lang.NullPointerException
>   at 
> org.apache.spark.sql.execution.ui.SQLListener.<init>(SQLListener.scala:34)
>   at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:77)
>   at org.apache.spark.repl.Main$.createSQLContext(Main.scala:92)
>   ... 47 elided
> <console>:13: error: not found: value sqlContext
>        import sqlContext.implicits._
>               ^
> <console>:13: error: not found: value sqlContext
>        import sqlContext.sql
>               ^
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 1.5.0-SNAPSHOT
>       /_/
> Using Scala version 2.11.7 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_60)
> Type in expressions to have them evaluated.
> Type :help for more information.
> scala>
> {code}
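>
> The parse failure itself is easy to reproduce outside the shell. A minimal 
> sketch (the object and app names are illustrative):
> {code}
> import org.apache.spark.{SparkConf, SparkContext}
>
> // Constructing a SparkContext with the same malformed master URL fails the
> // same way: "mesos:" is missing the "://" separator, so none of the master
> // URL patterns in SparkContext.createTaskScheduler match.
> object MasterUrlRepro {
>   def main(args: Array[String]): Unit = {
>     val conf = new SparkConf()
>       .setAppName("master-url-repro")    // illustrative app name
>       .setMaster("mesos:localhost:8080") // malformed; should be mesos://host:port
>     val sc = new SparkContext(conf)      // throws SparkException: Could not parse Master URL
>     sc.stop()                            // never reached
>   }
> }
> {code}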


