[ https://issues.apache.org/jira/browse/SPARK-2722?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14259459#comment-14259459 ]

Sean Owen commented on SPARK-2722:
----------------------------------

The first layer of escaping makes sense and is unavoidable, since it is for the 
benefit of the shell. The second layer also has to happen at some point, for the 
benefit of the shell that runs the executor. Is the suggestion that Spark do that 
escaping itself? That would hide the implementation detail, but it would also 
change existing behavior. Just checking what people are interested in 
implementing here before any work is done.
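
To make the two rounds of parsing concrete, here is a minimal sketch in Scala. 
It is not Spark's actual parser; unescapeOnce is a hypothetical helper that 
stands in for one layer of shell-style backslash stripping.

{code}
// Hypothetical illustration of the double-escaping problem: each round of
// shell-style parsing strips one layer of backslash escapes. This mimics,
// but is not, the parsing Spark and the shells actually do.
object EscapingDemo {
  // Strip one layer of backslash escaping: \\ -> \ and \" -> "
  def unescapeOnce(s: String): String = {
    val sb = new StringBuilder
    var i = 0
    while (i < s.length) {
      if (s(i) == '\\' && i + 1 < s.length) { sb += s(i + 1); i += 2 }
      else { sb += s(i); i += 1 }
    }
    sb.toString
  }

  def main(args: Array[String]): Unit = {
    // The value as written in spark-defaults.conf, outer quotes removed:
    val configured = """-Dmagic=\\\"Mr. Johnson\\\""""
    val afterFirstRound = unescapeOnce(configured)        // -Dmagic=\"Mr. Johnson\"
    val afterSecondRound = unescapeOnce(afterFirstRound)  // -Dmagic="Mr. Johnson"
    println(afterFirstRound)
    println(afterSecondRound)
  }
}
{code}

Each call strips exactly one layer, which is why the triple-backslash form in 
the issue description is the one that survives both rounds intact.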

> Mechanism for escaping spark configs is not consistent
> ------------------------------------------------------
>
>                 Key: SPARK-2722
>                 URL: https://issues.apache.org/jira/browse/SPARK-2722
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.0.1
>            Reporter: Andrew Or
>            Priority: Minor
>
> Currently, you can specify a spark config in spark-defaults.conf as follows:
> {code}
> spark.magic "Mr. Johnson"
> {code}
> and this will preserve the double quotes as part of the string. Naturally, if 
> you want to do the equivalent in spark.*.extraJavaOptions, you would use the 
> following:
> {code}
> spark.executor.extraJavaOptions "-Dmagic=\"Mr. Johnson\""
> {code}
> However, this fails: the backslashes are stripped, and "Johnson" ends up 
> being interpreted as an argument to the main class. Instead, you have to do 
> the following:
> {code}
> spark.executor.extraJavaOptions "-Dmagic=\\\"Mr. Johnson\\\""
> {code}
> which is not super intuitive (a quick runtime check is sketched after this 
> quoted description).
> Note that this only applies to standalone mode. In YARN it's not even 
> possible to use quoted strings in config values (SPARK-2718).
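
For completeness, one way to see which form survived is to print the property 
from the executor JVM itself. A minimal sketch (the property name "magic" comes 
from the example above; what gets printed depends on which escaping variant was 
configured):

{code}
// Minimal check: print -Dmagic exactly as the executor JVM received it,
// which shows how many layers of escaping were stripped along the way.
object MagicCheck {
  def main(args: Array[String]): Unit = {
    println(System.getProperty("magic"))
  }
}
{code}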


