How to Reload Spark Configuration Files

2014-06-24 Thread Sirisha Devineni
Hi All,

I am working with Spark to add new slaves automatically when there is more data 
to be processed by the cluster. During this process a question arose: after 
adding or removing a slave node to or from the Spark cluster, do we need to 
restart the master and the other existing slaves in the cluster?

From my observations:

1.   If a new slave node's details are added to the configuration 
file (/root/spark/conf/slaves) on the master node, running the start-slaves.sh 
script will add the new slave to the cluster without affecting the existing 
slaves or the master.

2.   If a slave's details are removed from the configuration file, one needs 
to restart the master using the stop-master.sh and start-master.sh scripts 
for the change to take effect.

Is there any reload option available in Spark to load the changed configuration 
files without stopping the services? Stopping the master or the existing slaves 
may lead to a service outage.
The options available to start/stop Spark services are listed at 
http://spark.apache.org/docs/latest/spark-standalone.html


Thanks & Regards,
Sirisha Devineni.

DISCLAIMER
==
This e-mail may contain privileged and confidential information which is the 
property of Persistent Systems Ltd. It is intended only for the use of the 
individual or entity to which it is addressed. If you are not the intended 
recipient, you are not authorized to read, retain, copy, print, distribute or 
use this message. If you have received this communication in error, please 
notify the sender and delete all copies of this message. Persistent Systems 
Ltd. does not accept any liability for virus infected mails.



Re: How to Reload Spark Configuration Files

2014-06-24 Thread Mayur Rustagi
Not really. You are better off using a cluster manager like Mesos or Yarn
for this.
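To illustrate what the suggestion changes: under a cluster manager such as YARN there is no static slaves file to edit, since capacity grows and shrinks as NodeManagers join or leave, and an application is handed to the resource manager instead. A hypothetical sketch using Spark 1.0-era spark-submit syntax (the jar and class names are placeholders; the command is built as a string so the sketch runs anywhere):

```shell
# With YARN, cluster membership is the ResourceManager's concern, not a
# conf/slaves file; the app is submitted to YARN in cluster mode:
SUBMIT="./bin/spark-submit --master yarn-cluster --class org.example.MyApp myapp.jar"
echo "$SUBMIT"
```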

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi https://twitter.com/mayur_rustagi






Re: How to Reload Spark Configuration Files

2014-06-24 Thread Peng Cheng
I've read somewhere that in 1.0 there is a bash tool called 'spark-config.sh'
that allows you to propagate your config files to a number of master and
slave nodes. However, I haven't used it myself.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-Reload-Spark-Configuration-Files-tp8159p8219.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.