[ https://issues.apache.org/jira/browse/SPARK-6568?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14491253#comment-14491253 ]
Jarl Haggerty commented on SPARK-6568:
--------------------------------------

When I try to run the following command, the backslashes in the file names get removed and I get an IllegalArgumentException. This happens under both Spark 1.3.0 and 1.2.1.

PS C:\Users\jarlhaggerty> C:\spark\bin\pyspark.cmd --master local --jars C:\Users\jarlhaggerty\Miniconda3\envs\py27\Lib\site-packages\thunder\lib\thunder_2.10-0.5.0.jar --driver-class-path C:\Users\jarlhaggerty\Miniconda3\envs\py27\lib\site-packages\thunder\lib\thunder_2.10-0.5.0.jar
Running C:\Users\jarlhaggerty\Miniconda3\envs\py27\python.exe with PYTHONPATH=C:\spark\bin\..\python\lib\py4j-0.8.2.1-src.zip;C:\spark\bin\..\python;
Python 2.7.9 |Anaconda 2.2.0 (64-bit)| (default, Dec 18 2014, 16:57:52) [MSC v.1500 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Anaconda is brought to you by Continuum Analytics.
Please check out: http://continuum.io/thanks and https://binstar.org
Exception in thread "main" java.lang.IllegalArgumentException: Given path is malformed: C:UsersjarlhaggertyMiniconda3envspy27Libsite-packagesthunderlibthunder_2.10-0.5.0.jar
        at org.apache.spark.util.Utils$.resolveURI(Utils.scala:1665)
        at org.apache.spark.util.Utils$$anonfun$resolveURIs$1.apply(Utils.scala:1687)
        at org.apache.spark.util.Utils$$anonfun$resolveURIs$1.apply(Utils.scala:1687)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
        at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
        at org.apache.spark.util.Utils$.resolveURIs(Utils.scala:1687)
        at org.apache.spark.deploy.SparkSubmitArguments.parse$1(SparkSubmitArguments.scala:391)
        at org.apache.spark.deploy.SparkSubmitArguments.parseOpts(SparkSubmitArguments.scala:288)
        at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:87)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:105)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Traceback (most recent call last):
  File "C:\spark\bin\..\python\pyspark\shell.py", line 50, in <module>
    sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
  File "C:\spark\python\pyspark\context.py", line 108, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "C:\spark\python\pyspark\context.py", line 222, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "C:\spark\python\pyspark\java_gateway.py", line 80, in launch_gateway
    raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number

> spark-shell.cmd --jars option does not accept the jar that has space in its path
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-6568
>                 URL: https://issues.apache.org/jira/browse/SPARK-6568
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.3.0
>        Environment: Windows 8.1
>            Reporter: Masayoshi TSUZUKI
>
> spark-shell.cmd --jars option does not accept a jar that has a space in its path.
> The path of a jar sometimes contains spaces on Windows.
> {code}
> bin\spark-shell.cmd --jars "C:\Program Files\some\jar1.jar"
> {code}
> this gets
> {code}
> Exception in thread "main" java.net.URISyntaxException: Illegal character in path at index 10: C:/Program Files/some/jar1.jar
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
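The quoted exception can be reproduced outside Spark: `java.net.URI` rejects a raw space in a path, while `File.toURI()` percent-escapes it. The sketch below is a minimal standalone illustration of that difference, not Spark's actual `Utils.resolveURI` logic:

```java
import java.io.File;
import java.net.URI;
import java.net.URISyntaxException;

public class SpacePathDemo {
    public static void main(String[] args) {
        // The path from the quoted report, with forward slashes as they
        // appear in the error message
        String path = "C:/Program Files/some/jar1.jar";

        // Parsing the raw string as a URI fails on the space (index 10),
        // producing the same message quoted in the issue description
        try {
            URI uri = new URI(path);
            System.out.println("parsed: " + uri);
        } catch (URISyntaxException e) {
            System.out.println(e.getMessage());
        }

        // File.toURI() percent-escapes the space instead of rejecting it
        System.out.println(new File(path).toURI());
    }
}
```

This suggests why quoting the path on the command line is not enough: once the string reaches `new URI(...)` unescaped, the space itself is illegal regardless of shell quoting.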