[ https://issues.apache.org/jira/browse/SPARK-26998?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16789653#comment-16789653 ]

Gabor Somogyi edited comment on SPARK-26998 at 3/11/19 2:47 PM:
----------------------------------------------------------------

Since the first part of the PR is solved (http URLs in case of secure mode),
I'm continuing with the second issue.
In my view the problem can be mitigated by asking users to provide the sensitive
configuration parameters in a configuration file (several commercial products do this):
* either spark-defaults.conf
* or a file passed with --properties-file

That way the command line will show either nothing (spark-defaults.conf is
picked up by default) or something like "... --properties-file
my-secret-spark-properties.conf ..."; see the sketch below.
As a side note, this workaround is already available today, but I would like to
warn users about such situations.
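
For example (a hypothetical sketch; the paths, class name, jar name and
password values below are made up), the sensitive options would live only in
the properties file:
{noformat}
# my-secret-spark-properties.conf (hypothetical example; keep it readable only by the submitting user)
spark.ssl.enabled                true
spark.ssl.keyStore               /path/to/keystore.jks
spark.ssl.keyStorePassword       keystore-secret
spark.ssl.keyPassword            key-secret
spark.ssl.trustStore             /path/to/truststore.jks
spark.ssl.trustStorePassword     truststore-secret
{noformat}
and the visible command line would only reference the file:
{noformat}
spark-submit --properties-file my-secret-spark-properties.conf \
  --class com.example.MyApp my-app.jar
{noformat}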

The other approach I've considered (and abandoned) is to open a pipe and send
the password through that channel, but since this approach does not really
conform to Spark's configuration system, it would imply heavy changes and I
don't see the return on investment.

[~vanzin] what do you think, since you have quite a bit of experience with security?



> spark.ssl.keyStorePassword in plaintext on 'ps -ef' output of executor 
> processes in Standalone mode
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26998
>                 URL: https://issues.apache.org/jira/browse/SPARK-26998
>             Project: Spark
>          Issue Type: Bug
>          Components: Scheduler, Security, Spark Core
>    Affects Versions: 2.3.3, 2.4.0
>            Reporter: t oo
>            Priority: Major
>              Labels: SECURITY, Security, secur, security, security-issue
>
> Run Spark in standalone mode, then start a spark-submit job requiring at least
> 1 executor. Do a 'ps -ef' on Linux (e.g. in a PuTTY terminal) and you will be
> able to see the spark.ssl.keyStorePassword value in plaintext!
>  
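> A quick way to observe this (a hypothetical sketch; the exact java command line
> depends on the deployment) is to grep the executor processes for the option:
> {noformat}
> ps -ef | grep CoarseGrainedExecutorBackend | grep -o 'spark.ssl.keyStorePassword=[^ ]*'
> {noformat}
> 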
> spark.ssl.keyStorePassword and spark.ssl.keyPassword don't need to be passed
> to CoarseGrainedExecutorBackend. Only spark.ssl.trustStorePassword is used.
>  
> This can be resolved if the PR below is merged:
> [[Github] Pull Request #21514 
> (tooptoop4)|https://github.com/apache/spark/pull/21514]


