Re: Default spark.deploy.recoveryMode

2014-10-15 Thread Prashant Sharma
[Removing dev lists] You are absolutely correct about that. -- Prashant Sharma

On Tue, Oct 14, 2014 at 5:03 PM, Priya Ch learnings.chitt...@gmail.com wrote: Hi Spark users/experts, in the Spark source code (Master.scala, Worker.scala), when registering the worker with the master, I see the usage of *persistenceEngine*...
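What is being confirmed here is decided at Master startup, where the persistence engine is chosen from spark.deploy.recoveryMode. A simplified sketch of that selection, paraphrased from Spark 1.x Master.scala (not the verbatim source; constructor arguments are abbreviated):

    // Paraphrased from Spark 1.x Master.scala: the property defaults to "NONE".
    val recoveryMode = conf.get("spark.deploy.recoveryMode", "NONE")

    val persistenceEngine = recoveryMode match {
      case "ZOOKEEPER" =>
        // Master state (applications, drivers, workers) is kept in ZooKeeper.
        new ZooKeeperPersistenceEngine(serializer, conf)
      case "FILESYSTEM" =>
        // Master state is written under spark.deploy.recoveryDirectory.
        new FileSystemPersistenceEngine(recoveryDir, serializer)
      case _ =>
        // The default "NONE": a no-op engine, so nothing is ever persisted.
        new BlackHolePersistenceEngine()
    }

With no property set, conf.get falls through to the default string "NONE", which lands in the catch-all case.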

Re: Default spark.deploy.recoveryMode

2014-10-15 Thread Chitturi Padma
Which means the details are not persisted, and hence if the master or workers fail, the daemons wouldn't start up normally again... right?
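That is exactly what the default engine implies: BlackHolePersistenceEngine is a no-op, so a restarted master finds nothing to recover. A sketch in the spirit of the Spark source (signatures follow the later 1.x PersistenceEngine trait; earlier releases use addApplication/addWorker-style methods that are just as empty):

    import scala.reflect.ClassTag

    private[spark] class BlackHolePersistenceEngine extends PersistenceEngine {
      // Writes go nowhere: the object is simply dropped.
      override def persist(name: String, obj: Object): Unit = {}
      override def unpersist(name: String): Unit = {}
      // Reads always come back empty, so recovery finds no prior state.
      override def read[T: ClassTag](prefix: String): Seq[T] = Nil
    }

Setting spark.deploy.recoveryMode to FILESYSTEM or ZOOKEEPER swaps in an engine that actually writes this state, which is what allows a restarted master to re-register running applications, drivers, and workers.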

Default spark.deploy.recoveryMode

2014-10-14 Thread Priya Ch
Hi Spark users/experts, in the Spark source code (Master.scala, Worker.scala), when registering the worker with the master, I see the usage of *persistenceEngine*. When we don't specify spark.deploy.recoveryMode explicitly, what is the default value used? This recovery mode is used to persist and...
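As confirmed up-thread, the default is NONE, meaning nothing is persisted. To enable actual recovery, the standalone-mode documentation has you configure the daemons through SPARK_DAEMON_JAVA_OPTS in conf/spark-env.sh; a minimal sketch (the recovery directory path below is only an example):

    # conf/spark-env.sh on the master node.
    # FILESYSTEM persists master state to a local directory;
    # ZOOKEEPER would use spark.deploy.zookeeper.url instead.
    SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=FILESYSTEM -Dspark.deploy.recoveryDirectory=/var/spark/recovery"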