Re: Installing Spark Standalone to a Cluster

2015-01-23 Thread HARIPRIYA AYYALASOMAYAJULA
Not needed. You can directly follow the standalone installation instructions.
However, you might need sbt to package your files into a jar.
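
A minimal sketch of that packaging step, assuming an sbt project; the main class
MyApp and the jar path (sbt's default layout) are placeholders, not taken from
this thread:

# package your application classes into a jar, run from the project root
sbt package

# submit the jar to the standalone cluster; adjust the class, master URL and jar name
./bin/spark-submit --class MyApp --master spark://IP:PORT \
  target/scala-2.10/my-app_2.10-0.1.jar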

On Fri, Jan 23, 2015 at 11:54 AM, riginos samarasrigi...@gmail.com wrote:

 Do I need to manually install and configure Hadoop before doing anything with
 Spark standalone?







-- 
Regards,
Haripriya Ayyalasomayajula
Graduate Student
Department of Computer Science
University of Houston
Contact : 650-796-7112


Re: Installing Spark Standalone to a Cluster

2015-01-23 Thread Akhil Das
Which variable is it that you don't understand?

Here's a minimalistic spark-env.sh of mine.

export SPARK_MASTER_IP=192.168.10.28

export HADOOP_CONF_DIR=/home/akhld/sigmoid/localcluster/hadoop/conf
export HADOOP_HOME=/home/akhld/sigmoid/localcluster/hadoop/
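
For reference, a few other standalone variables are commonly set in the same file;
the values below are placeholder assumptions, not taken from the file above:

export SPARK_MASTER_PORT=7077     # port the master listens on (7077 is the default)
export SPARK_WORKER_CORES=4       # cores each worker may use
export SPARK_WORKER_MEMORY=4g     # total memory each worker can give to executors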



Thanks
Best Regards

On Fri, Jan 23, 2015 at 11:50 PM, riginos samarasrigi...@gmail.com wrote:

 I need someone to send me a snapshot of their conf/spark-env.sh file because I
 don't understand how to set some vars like SPARK_MASTER, etc.







Re: Installing Spark Standalone to a Cluster

2015-01-22 Thread Yana Kadiyska
You can do ./sbin/start-slave.sh with the master URL; I believe the worker isn't
finding the master. In addition, it's a good idea to pass exactly the Spark
master's endpoint as it is shown on your UI at http://localhost:8080. That should
do it. If it's still not working, you can look at the Worker log to see where it's
trying to connect and whether it's getting any errors.
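
A minimal sketch of the two commands, assuming the endpoint shown on the master's
web UI is spark://192.168.1.10:7077; substitute your own host and port:

# on the master node
./sbin/start-master.sh

# on each worker node, using the exact URL shown on the master's web UI
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://192.168.1.10:7077

# if the worker still does not appear, check its log for connection errors
# (the exact file name under logs/ varies with user name and host)
tail logs/spark-*-org.apache.spark.deploy.worker.Worker-*.out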

On Thu, Jan 22, 2015 at 12:06 PM, riginos samarasrigi...@gmail.com wrote:

 I have downloaded spark-1.2.0.tgz on each of my nodes and executed ./sbt/sbt
 assembly on each of them. Then I executed ./sbin/start-master.sh on my master
 and ./bin/spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT.
 However, when I go to http://localhost:8080 I cannot see any worker. Why is
 that? Did I do something wrong with the installation or deployment of Spark?


