[ https://issues.apache.org/jira/browse/SPARK-8596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14599739#comment-14599739 ]

Guorong Xu commented on SPARK-8596:
-----------------------------------

When I install Spark on EC2 using the spark-ec2 script, I assume Spark is 
installed on the driver node. If I then install Spark again in /home/rstudio 
on the driver node, I will have two copies of the Spark installation on the 
driver node. Will RStudio submit jobs to the correct Spark installation and 
run computations across all worker nodes?
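
For reference, a minimal sketch of how RStudio Server could be pointed at the 
existing spark-ec2 installation instead of a second copy, assuming spark-ec2 
placed Spark under /root/spark (its default location) and that the standalone 
master it started is listening on port 7077; <master-hostname> is a 
placeholder for the cluster master's address:

    # Reuse the spark-ec2 installation rather than a second copy under /home/rstudio
    # (assumes the spark-ec2 default install path /root/spark)
    Sys.setenv(SPARK_HOME = "/root/spark")

    # Load the SparkR package bundled with that installation
    library(SparkR, lib.loc = file.path(Sys.getenv("SPARK_HOME"), "R", "lib"))

    # Connect to the standalone master started by spark-ec2 so jobs run
    # across all worker nodes instead of locally; <master-hostname> is a placeholder
    sc <- sparkR.init(master = "spark://<master-hostname>:7077",
                      sparkHome = Sys.getenv("SPARK_HOME"))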

> Install and configure RStudio server on Spark EC2
> -------------------------------------------------
>
>                 Key: SPARK-8596
>                 URL: https://issues.apache.org/jira/browse/SPARK-8596
>             Project: Spark
>          Issue Type: Improvement
>          Components: EC2, SparkR
>            Reporter: Shivaram Venkataraman
>
> This will make it convenient for R users to use SparkR from their browsers.


