[ 
https://issues.apache.org/jira/browse/SPARK-6816?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14567187#comment-14567187
 ] 

Rick Moritz commented on SPARK-6816:
------------------------------------

One current drawback of SparkR's configuration options is the inability to set 
driver JVM options. These are crucial when attempting to run SparkR on a 
Hortonworks HDP cluster, as both the driver and the application master need to 
be aware of the hdp.version variable in order to resolve the classpath.

While it is possible to pass this variable to the executors, there is no way to 
pass it to the driver, except for the following exploit/work-around:

The SPARK_MEM variable can be abused to pass the required parameters to the 
driver's JVM via string concatenation. Setting the variable to (e.g.) 
512m -Dhdp.version=NNN appends the -D option to the -Xmx option that is 
currently derived from this environment variable. A far more obvious and less 
hacky approach would be to add a secondary environment variable that gets 
parsed for JVM options, or to add a separate environment list for the driver, 
extending what is currently available for executors.
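To make the work-around concrete, here is a minimal sketch of the abuse 
described above. NNN is the placeholder from the comment, not a real HDP 
version string; substitute your cluster's value. SPARK_MEM is intended to hold 
only a heap size (e.g. 512m), but because its value is concatenated into the 
driver JVM's option string, extra -D flags ride along:

```shell
# Smuggle a JVM system property into the SparkR driver by appending it to
# the heap-size value that SPARK_MEM is read for.
export SPARK_MEM="512m -Dhdp.version=NNN"
echo "$SPARK_MEM"
```

After exporting this, start SparkR as usual; the driver JVM ends up receiving 
both the memory setting and -Dhdp.version=NNN.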

I'm adding this as a comment to this issue, since I believe it is closely 
enough related not to warrant a separate issue.

> Add SparkConf API to configure SparkR
> -------------------------------------
>
>                 Key: SPARK-6816
>                 URL: https://issues.apache.org/jira/browse/SPARK-6816
>             Project: Spark
>          Issue Type: New Feature
>          Components: SparkR
>            Reporter: Shivaram Venkataraman
>            Priority: Minor
>
> Right now the only way to configure SparkR is to pass in arguments to 
> sparkR.init. The goal is to add an API similar to SparkConf on Scala/Python 
> to make configuration easier



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
