Good morning,

I am having trouble finishing an installation of the newest Spark release
(1.4.0): deploying it to an Amazon EC2 instance and running RStudio on top
of it.

Using these instructions
(http://spark.apache.org/docs/latest/ec2-scripts.html), we can fire up an
EC2 cluster, which we have done successfully: the cluster launches from the
command line without issue.  Then I installed RStudio Server on the same
EC2 instance (the master) and successfully logged into it through the web
browser (as the test/test user).
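
For reference, we are launching the cluster with the spark-ec2 script
roughly like this (the key pair, identity file, slave count, and cluster
name below are placeholders for our own values):

    # run from the ec2/ directory of the Spark 1.4.0 download
    ./spark-ec2 -k my-keypair -i ~/my-keypair.pem -s 2 launch sparkr-test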

This is where I get stuck: within RStudio, when I try to locate the folder
where SparkR was installed so that I can load the SparkR library and
initialize a SparkContext, I either get permission errors on the folders or
the library cannot be found, because I cannot locate the directory the
library is sitting in.
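
This is a sketch of what I am attempting in RStudio, assuming spark-ec2's
default install location of /root/spark (that path assumption may be
exactly what is wrong in my case):

    # assumes Spark lives at /root/spark, the spark-ec2 default
    Sys.setenv(SPARK_HOME = "/root/spark")
    # put the bundled SparkR package on the library search path
    .libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
    library(SparkR)

    # initialize a SparkContext (1.4.0 API); local[2] for a smoke test,
    # or point master at spark://<master-hostname>:7077 for the cluster
    sc <- sparkR.init(master = "local[2]", appName = "SparkR-RStudio")
    sqlContext <- sparkRSQL.init(sc)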

Has anyone successfully launched and used SparkR 1.4.0 this way, with
RStudio Server running on top of the master instance?  Are we on the right
track, or should we launch a cluster manually and try to connect to it from
another instance running R?

Thank you in advance!

Mark


