[ 
https://issues.apache.org/jira/browse/SPARK-50191?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

bingzuochen updated SPARK-50191:
--------------------------------
    Description: 
In Yarn mode, the javaOptions configuration items, such as *spark.driver.defaultJavaOptions*, *spark.driver.extraJavaOptions*, *spark.executor.defaultJavaOptions*, and *spark.executor.extraJavaOptions*, are passed through a `bash -c` command, which may create an arbitrary command execution risk. Can we add a configuration item that disables javaOptions when it is not required, to eliminate this risk?
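To illustrate the risk described above, here is a minimal sketch (not Spark's actual launcher code; the command template, `app.jar`, and `Main` are invented for illustration): when an option value is spliced into a `bash -c` command line without escaping, shell metacharacters in the value become part of the command.

```scala
// Hypothetical sketch: naive interpolation of a javaOptions value into the
// shell command line that YARN would execute. All names here are made up.
object JavaOptionsRiskSketch {
  // Unsafe: the option value is concatenated into the double-quoted command.
  def buildCommand(javaOpts: String): String =
    "bash -c \"java " + javaOpts + " -cp app.jar Main\""

  def main(args: Array[String]): Unit = {
    // A crafted value closes the double quote, terminates the java invocation
    // with ';', and injects an arbitrary command before reopening the quote.
    println(buildCommand("-Xmx1g\"; touch /tmp/pwned; echo \""))
  }
}
```

Running this prints a command line in which `touch /tmp/pwned` would execute as a separate shell command, which is why the escaping in `YarnSparkHadoopUtil` matters.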

 

resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnSparkHadoopUtil.scala

 
{code:java}
/**
 * Escapes a string for inclusion in a command line executed by Yarn. Yarn executes commands
 * using either
 *
 * (Unix-based) `bash -c "command arg1 arg2"` and that means plain quoting doesn't really work.
 * The argument is enclosed in single quotes and some key characters are escaped.
 *
 * (Windows-based) part of a .cmd file in which case windows escaping for each argument must be
 * applied. Windows is quite lenient, however it is usually Java that causes trouble, needing to
 * distinguish between arguments starting with '-' and class names. If arguments are surrounded
 * by ' java takes the following string as is, hence an argument is mistakenly taken as a class
 * name which happens to start with a '-'. The way to avoid this, is to surround nothing with
 * a ', but instead with a ".
 *
 * @param arg A single argument.
 * @return Argument quoted for execution via Yarn's generated shell script.
 */ {code}
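The single-quote strategy the Scaladoc describes for the Unix case can be sketched as follows (a simplified illustration, not Spark's actual `escapeForShell` implementation; the set of escaped characters is an assumption based on the comment's "some key characters"):

```scala
// Minimal sketch of the Unix-side escaping described in the Scaladoc:
// enclose the argument in single quotes and escape the key characters.
object ShellEscapeSketch {
  def escapeForShell(arg: String): String = {
    val sb = new StringBuilder("'")
    arg.foreach {
      case '\'' => sb.append("'\\''") // close quote, escaped quote, reopen
      case '$'  => sb.append("\\$")
      case '"'  => sb.append("\\\"")
      case c    => sb.append(c)
    }
    sb.append("'").toString
  }

  def main(args: Array[String]): Unit = {
    // The injection attempt from the description becomes inert data:
    println(escapeForShell("-Xmx1g\"; touch /tmp/pwned; echo \""))
  }
}
```

Once quoted this way, the whole value is passed to `java` as a single argument, so shell metacharacters inside it are never interpreted by `bash -c`.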

  was:In Yarn mode, the javaOptions configuration item is executed as the `bash -c` command, which may cause risks. Can we add a configuration item to eliminate risks when the javaOptions configuration is not required?


> Flexible configuration of javaOptions.
> --------------------------------------
>
>                 Key: SPARK-50191
>                 URL: https://issues.apache.org/jira/browse/SPARK-50191
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>    Affects Versions: 3.5.3
>            Reporter: bingzuochen
>            Priority: Major
>
> In Yarn mode, the javaOptions configuration items, such as *spark.driver.defaultJavaOptions*, *spark.driver.extraJavaOptions*, *spark.executor.defaultJavaOptions*, and *spark.executor.extraJavaOptions*, are passed through a `bash -c` command, which may create an arbitrary command execution risk. Can we add a configuration item that disables javaOptions when it is not required, to eliminate this risk?
>  
> resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnSparkHadoopUtil.scala
>  
> {code:java}
> /**
>  * Escapes a string for inclusion in a command line executed by Yarn. Yarn executes commands
>  * using either
>  *
>  * (Unix-based) `bash -c "command arg1 arg2"` and that means plain quoting doesn't really work.
>  * The argument is enclosed in single quotes and some key characters are escaped.
>  *
>  * (Windows-based) part of a .cmd file in which case windows escaping for each argument must be
>  * applied. Windows is quite lenient, however it is usually Java that causes trouble, needing to
>  * distinguish between arguments starting with '-' and class names. If arguments are surrounded
>  * by ' java takes the following string as is, hence an argument is mistakenly taken as a class
>  * name which happens to start with a '-'. The way to avoid this, is to surround nothing with
>  * a ', but instead with a ".
>  *
>  * @param arg A single argument.
>  * @return Argument quoted for execution via Yarn's generated shell script.
>  */ {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
