[jira] [Assigned] (SPARK-10500) sparkr.zip cannot be created if $SPARK_HOME/R/lib is unwritable

2015-10-31 Thread Apache Spark (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-10500?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-10500:


Assignee: Apache Spark

> sparkr.zip cannot be created if $SPARK_HOME/R/lib is unwritable
> ---------------------------------------------------------------
>
> Key: SPARK-10500
> URL: https://issues.apache.org/jira/browse/SPARK-10500
> Project: Spark
>  Issue Type: Bug
>  Components: SparkR
>Affects Versions: 1.5.0
>Reporter: Jonathan Kelly
>Assignee: Apache Spark
>
> As of SPARK-6797, sparkr.zip is re-created each time spark-submit is run with
> an R application, which fails if Spark has been installed into a directory
> where the current user lacks write permission. (For example, on EMR's
> emr-4.0.0 release, Spark is installed at /usr/lib/spark, which is writable
> only by root.)
>
> Would it be possible to skip creating sparkr.zip if it already exists? That
> would let sparkr.zip be pre-created by the root user and then reused each
> time spark-submit is run, which I believe is similar to how pyspark works.
>
> Another option would be to make the location configurable, as it is currently
> hardcoded to $SPARK_HOME/R/lib/sparkr.zip. Allowing it to be set to
> something like the user's home directory or a random path in /tmp would get
> around the permissions issue.
>
> By the way, why does spark-submit need to re-create sparkr.zip every time a
> new R application is launched? This seems unnecessary and inefficient unless
> you are actively developing the SparkR libraries and expect the contents of
> sparkr.zip to change.
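The two suggestions in the quoted report (reusing a pre-created archive, and making the target location configurable) could be sketched roughly as below. This is an illustrative Python sketch only; the names ensure_sparkr_zip and SPARKR_PACKAGE_DIR are hypothetical and do not correspond to Spark's actual launcher code, which is written in Scala.

```python
# Illustrative sketch of the proposals in SPARK-10500: skip rebuilding
# sparkr.zip when one already exists, and allow the target directory to be
# overridden. All names here are hypothetical, not Spark's real API.
import os
import zipfile


def ensure_sparkr_zip(spark_home, overwrite=False):
    """Create sparkr.zip under $SPARK_HOME/R/lib unless it already exists.

    A hypothetical SPARKR_PACKAGE_DIR environment variable redirects the
    output, working around an unwritable install directory such as
    /usr/lib/spark on EMR.
    """
    lib_dir = os.environ.get("SPARKR_PACKAGE_DIR",
                             os.path.join(spark_home, "R", "lib"))
    zip_path = os.path.join(lib_dir, "sparkr.zip")

    # Proposal 1: reuse a pre-created archive (e.g. one built by root at
    # install time) instead of rebuilding it on every spark-submit.
    if os.path.exists(zip_path) and not overwrite:
        return zip_path

    # Fail with a clear error rather than an opaque one if the target
    # directory is not writable by the current user.
    if not os.access(lib_dir, os.W_OK):
        raise PermissionError("cannot write sparkr.zip to " + lib_dir)

    # Zip up the installed SparkR package directory.
    src_dir = os.path.join(spark_home, "R", "lib", "SparkR")
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                full = os.path.join(root, name)
                zf.write(full, os.path.relpath(full, os.path.dirname(src_dir)))
    return zip_path
```

Under this scheme an administrator could run the function once as root (or point SPARKR_PACKAGE_DIR at a user-writable path), and subsequent spark-submit invocations would simply reuse the archive.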



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-10500) sparkr.zip cannot be created if $SPARK_HOME/R/lib is unwritable

2015-10-31 Thread Apache Spark (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-10500?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Apache Spark reassigned SPARK-10500:


Assignee: (was: Apache Spark)
