Hi All,

I am quite new to Spark, so please pardon me if this is a very basic question.

I have set up a Hadoop cluster using Hortonworks' Ambari. It has 1 master and
3 worker nodes. Currently, it has the HDFS, YARN, MapReduce2, HBase and
ZooKeeper services installed.

Now I want to install Spark on it. How do I do that? I searched a lot
online, but there is no clear step-by-step installation guide for this;
all I can find are standalone setup guides. Can someone provide the steps?
What needs to be copied to each machine? Where, and what config changes
should be made on each machine?
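
To clarify what I am asking, here is my rough guess from the Spark-on-YARN
docs: it sounds like it might be enough to unpack the Spark binaries on the
node that submits jobs and point HADOOP_CONF_DIR at the cluster's Hadoop
configs, roughly like the sketch below. The paths and the yarn-cluster mode
are just my guesses, so please correct me if this is wrong:

    # on the machine I would submit jobs from (paths are my guesses)
    export HADOOP_CONF_DIR=/etc/hadoop/conf
    export YARN_CONF_DIR=/etc/hadoop/conf

    # run the bundled SparkPi example against the YARN cluster
    ./bin/spark-submit \
        --master yarn-cluster \
        --class org.apache.spark.examples.SparkPi \
        lib/spark-examples-*.jar 10

Is that really all, or does something also have to be installed or
configured on the master and the three worker nodes themselves?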

Thanks.


