Your exception handling is occurring on the driver, where you 'configure' the job, which I don't think is what you mean to do. The DStream operation in your try block only registers a computation in the streaming graph; once the action in the catch block has been triggered and registered too, it stays part of the graph and runs for every subsequent batch. You probably mean to put the try/catch inside a function you are executing on data within the cluster, like mapPartitions or foreachRDD.
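For example, a minimal sketch of what I mean (assuming someDstream is a DStream[String] read from a socket; process and handleFailure are hypothetical stand-ins for your Step1 / Step2 logic):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object PerRecordHandling {
      // Hypothetical stand-ins for your Step1 and Step2 logic.
      def process(record: String): Unit = { /* Step1 */ }
      def handleFailure(record: String, ex: Exception): Unit = { /* Step2 */ }

      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("per-record-handling")
        val ssc = new StreamingContext(conf, Seconds(10))
        val someDstream = ssc.socketTextStream("localhost", 9999)

        someDstream.foreachRDD { rdd =>
          rdd.foreachPartition { records =>
            records.foreach { record =>
              // This try/catch executes on the executors, per record of
              // each batch, not on the driver at graph-construction time.
              try {
                process(record)                // Step1
              } catch {
                case ex: Exception =>
                  handleFailure(record, ex)    // Step2, only for the failing record
              }
            }
          }
        }

        ssc.start()
        ssc.awaitTermination()
      }
    }

That way the fallback runs only when something in the current batch actually fails, rather than being permanently wired into the streaming graph.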
On Fri, Sep 25, 2015 at 5:20 AM, Samya <samya.ma...@amadeus.com> wrote:
> Hi Team,
>
> I have a code piece as follows.
>
> try{
>   someDstream.someaction(.......) //Step1
> }catch{
>   case ex:Exception =>{
>     someDstream.someaction(.......) //Step2
>   }
> }
>
> When I get an exception for the current batch, Step2 executes as expected.
>
> But for the next batches, when there is no exception, both Step1 and
> Step2 still execute. In this scenario I want only Step1 to execute.
>
> Regards,
> Sam
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Stop-a-Dstream-computation-tp24816.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.