Ming Li created SPARK-38807:
-------------------------------

             Summary: Error when starting spark shell on Windows system
                 Key: SPARK-38807
                 URL: https://issues.apache.org/jira/browse/SPARK-38807
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.2.1
            Reporter: Ming Li
Using the release version of spark-3.2.1 with the default configuration, starting the Spark shell on a Windows system fails. (Spark 3.1.2 does not show this issue.) Here is the stack trace of the exception:

{code:java}
22/04/06 21:47:45 ERROR SparkContext: Error initializing SparkContext.
java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	...
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.URISyntaxException: Illegal character in path at index 30: spark://192.168.X.X:56964/F:\classes
	at java.net.URI$Parser.fail(URI.java:2845)
	at java.net.URI$Parser.checkChars(URI.java:3018)
	at java.net.URI$Parser.parseHierarchical(URI.java:3102)
	at java.net.URI$Parser.parse(URI.java:3050)
	at java.net.URI.<init>(URI.java:588)
	at org.apache.spark.repl.ExecutorClassLoader.<init>(ExecutorClassLoader.scala:57)
	... 70 more
22/04/06 21:47:45 ERROR Utils: Uncaught exception in thread main
java.lang.NullPointerException
...
{code}

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
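The parse failure in the trace can be reproduced outside Spark: {{java.net.URI}} rejects the backslash that a Windows class directory such as {{F:\classes}} introduces into the class-server URI. A minimal sketch (the address below is the redacted one from the trace; it stands in for whatever the driver actually binds):

{code:java}
import java.net.URI;
import java.net.URISyntaxException;

public class UriBackslashRepro {
    public static void main(String[] args) {
        // Same shape as the URI in the stack trace: a spark:// class-server
        // address with a Windows path ("F:\classes") appended. The host is
        // redacted/illustrative, as in the reported trace.
        String spec = "spark://192.168.X.X:56964/F:\\classes";
        try {
            new URI(spec);
            System.out.println("parsed OK (unexpected)");
        } catch (URISyntaxException e) {
            // '\' is not a legal character in a URI path, so parsing fails
            // with "Illegal character in path", as in the trace above.
            System.out.println("URISyntaxException: " + e.getMessage());
        }
    }
}
{code}

This suggests the regression is in how 3.2.1 builds the REPL class URI from a local Windows path before handing it to {{ExecutorClassLoader}}, rather than in the URI parser itself.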