Re: Spark Standalone Cluster: Having a master and worker on the same node

2016-07-28 Thread Chanh Le
Hi Jestin, I've seen that most setups co-locate the master and a slave on the same node, because the master doesn't do nearly as much work as a slave does and resources are expensive, so we should make use of them. BTW, in my own setup I also co-locate master and slave: I have 5 nodes, and 3 of them run both master and slave
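
Co-locating a master and a worker on one node in standalone mode comes down to starting both daemons on that host. A minimal sketch, assuming Spark (1.6/2.x era, matching the thread's date) is installed at $SPARK_HOME and the master runs on a hypothetical host named node1:

    # on node1: start the standalone master (listens on port 7077 by default)
    $SPARK_HOME/sbin/start-master.sh

    # still on node1: start a worker that registers with the local master
    $SPARK_HOME/sbin/start-slave.sh spark://node1:7077

    # on each remaining node: start a worker pointing at the same master
    $SPARK_HOME/sbin/start-slave.sh spark://node1:7077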

Re: Spark Standalone Cluster: Having a master and worker on the same node

2016-07-27 Thread Mich Talebzadeh
Hi Jestin. As I understand it, you are using Spark in standalone mode, meaning that you start the master and slave/worker processes yourself. You can specify the number of workers for each node in the $SPARK_HOME/conf/spark-env.sh file, as below: # Options for the daemons used in the standalone deploy mode export
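
The spark-env.sh options for the standalone daemons control how many worker instances each node starts and how much CPU and memory each may use. A minimal sketch; the values below are illustrative placeholders, not taken from the original message:

    # $SPARK_HOME/conf/spark-env.sh
    # Options for the daemons used in the standalone deploy mode
    export SPARK_WORKER_INSTANCES=2    # number of worker processes per node
    export SPARK_WORKER_CORES=4        # cores each worker may hand to executors
    export SPARK_WORKER_MEMORY=8g      # memory each worker may hand to executors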

Spark Standalone Cluster: Having a master and worker on the same node

2016-07-27 Thread Jestin Ma
Hi, I'm doing performance testing and currently have 1 master node and 4 worker nodes, and I'm submitting in client mode from a 6th cluster node. I know we can have a master and a worker on the same node. Speaking in terms of performance and practicality, is it possible/advisable to have another
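
For reference, submitting in client mode against a standalone master from a separate node looks roughly like this; the master URL, class name, jar, and resource figures are assumed placeholders, not Jestin's actual values:

    $SPARK_HOME/bin/spark-submit \
      --master spark://node1:7077 \
      --deploy-mode client \
      --executor-memory 4g \
      --total-executor-cores 16 \
      --class com.example.PerfTest \
      perf-test.jar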