Marcin Kurczych created SPARK-23464:
---------------------------------------
             Summary: MesosClusterScheduler double-escapes parameters to bash command
                 Key: SPARK-23464
                 URL: https://issues.apache.org/jira/browse/SPARK-23464
             Project: Spark
          Issue Type: Bug
          Components: Mesos
    Affects Versions: 2.2.0
         Environment: Spark 2.2.0 with Mesosphere patches (but the problem exists in main repo)
                      DC/OS 1.9.5
            Reporter: Marcin Kurczych

Parameters passed to the driver launching command in the Mesos container are escaped using the _shellEscape_ function. In SPARK-18114, additional wrapping in double quotes was introduced. This cancels out the quoting done by _shellEscape_ and makes it impossible to run tasks with whitespace in parameters, because the pieces are interpreted as additional parameters to the in-container spark-submit.

This is how a parameter passed to the in-container spark-submit looks now:
{code}
--conf "spark.executor.extraJavaOptions="-Dfoo=\"first value\" -Dbar=another""
{code}
This is how it looks after reverting the SPARK-18114-related commit:
{code}
--conf spark.executor.extraJavaOptions="-Dfoo=\"first value\" -Dbar=another"
{code}
In the current version, submitting a job with such extraJavaOptions causes the following error:
{code}
Error: Unrecognized option: -Dfoo=another

Usage: spark-submit [options] <app jar | python file> [app arguments]
Usage: spark-submit --kill [submission ID] --master [spark://...]
Usage: spark-submit --status [submission ID] --master [spark://...]
Usage: spark-submit run-example [options] example-class [example args]

Options:
  --master MASTER_URL         spark://host:port, mesos://host:port, yarn, or local.
  --deploy-mode DEPLOY_MODE   Whether to launch the driver program locally ("client") or
                              on one of the worker machines inside the cluster ("cluster")
                              (Default: client).
(... further spark-submit help ...)
{code}
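For illustration, here is a minimal, self-contained sketch of the double-escaping. This is not the actual Spark code: {{shellEscape}} below is a simplified stand-in for the helper used by MesosClusterScheduler, and {{EscapeDemo}} is just an illustrative object name.
{code}
// Simplified stand-in for MesosClusterScheduler's shellEscape: wrap a value that
// contains whitespace or shell-special characters in double quotes, escaping any
// embedded quotes, dollar signs and backticks. The real helper may differ in detail.
object EscapeDemo {
  def shellEscape(value: String): String = {
    if (value.exists(c => c.isWhitespace || c == '"' || c == '$' || c == '`')) {
      val escaped = value.replace("\"", "\\\"").replace("$", "\\$").replace("`", "\\`")
      "\"" + escaped + "\""
    } else {
      value
    }
  }

  def main(args: Array[String]): Unit = {
    // User-supplied option value containing whitespace and quotes.
    val opt = """-Dfoo="first value" -Dbar=another"""
    val conf = "spark.executor.extraJavaOptions=" + shellEscape(opt)

    // Quoting applied once, by shellEscape only (behaviour after reverting SPARK-18114):
    //   --conf spark.executor.extraJavaOptions="-Dfoo=\"first value\" -Dbar=another"
    println("--conf " + conf)

    // Second layer of quotes added on top (current behaviour): the outer quotes close
    // at the first quote emitted by shellEscape, so bash splits the remainder on
    // whitespace and spark-submit receives the pieces as extra arguments:
    //   --conf "spark.executor.extraJavaOptions="-Dfoo=\"first value\" -Dbar=another""
    println("--conf \"" + conf + "\"")
  }
}
{code}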