https://spark.apache.org/docs/latest/monitoring.html

 

Also subscribe to the various listeners for the various metrics types, e.g. job 
stats/statuses - this will let you (in the driver) decide when to stop the 
context gracefully (the listening and the stopping can be done from a 
completely separate thread in the driver); a sketch follows below.
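A minimal sketch of that pattern, assuming Spark 1.4-era APIs (the failure
threshold of 10, the app name, and the thread name are all illustrative): a
SparkListener counts failed jobs, and a separate driver thread stops the
StreamingContext once the count crosses the threshold. Note that JobFailed is
private[spark], so the sketch tests against JobSucceeded instead, and that
stop() must not be called from inside the listener callback itself.

import java.util.concurrent.atomic.AtomicInteger

import org.apache.spark.SparkConf
import org.apache.spark.scheduler.{JobSucceeded, SparkListener, SparkListenerJobEnd}
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StopOnJobFailure {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("stop-on-failure"), Seconds(10))

    val failures = new AtomicInteger(0)
    val maxFailures = 10  // illustrative retry budget, per the question below

    // Count failed jobs. JobFailed is private[spark], so test against JobSucceeded.
    ssc.sparkContext.addSparkListener(new SparkListener {
      override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = {
        if (jobEnd.jobResult != JobSucceeded) failures.incrementAndGet()
      }
    })

    // ... define the DStream pipeline here ...

    ssc.start()

    // Stop from a separate driver thread, never from the listener callback:
    // the event bus would otherwise block waiting on its own shutdown.
    val stopper = new Thread("streaming-stopper") {
      override def run(): Unit = {
        while (failures.get() < maxFailures) Thread.sleep(1000)
        ssc.stop(stopSparkContext = true, stopGracefully = true)
      }
    }
    stopper.setDaemon(true)  // don't keep the JVM alive if the app exits another way
    stopper.start()

    ssc.awaitTermination()
  }
}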

 

https://spark.apache.org/docs/latest/api/java/

 

org.apache.spark.ui.jobs

Class JobProgressListener

    Object
        org.apache.spark.ui.jobs.JobProgressListener

All Implemented Interfaces:
    Logging <https://spark.apache.org/docs/latest/api/java/org/apache/spark/Logging.html>,
    SparkListener <https://spark.apache.org/docs/latest/api/java/org/apache/spark/scheduler/SparkListener.html>

  _____  

public class JobProgressListener
extends Object
implements SparkListener <https://spark.apache.org/docs/latest/api/java/org/apache/spark/scheduler/SparkListener.html>, Logging <https://spark.apache.org/docs/latest/api/java/org/apache/spark/Logging.html>

:: DeveloperApi :: Tracks task-level information to be displayed in the UI. 

All access to the data structures in this class must be synchronized on the 
class, since the UI thread and the EventBus loop may otherwise be reading and 
updating the internal data structures concurrently.
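If you want the aggregated numbers the UI itself uses, you can register your
own JobProgressListener instance and read from it. A rough sketch, assuming an
existing SparkContext `sc` and the 1.x field names (this is a DeveloperApi, so
its shape can change between releases); note every read is synchronized on the
listener, as the class doc above requires:

import org.apache.spark.ui.jobs.JobProgressListener

// Register a private copy of the UI's own listener.
val progress = new JobProgressListener(sc.getConf)
sc.addSparkListener(progress)

// Read later (e.g. from the monitoring thread), synchronizing on the
// listener per its documented contract.
val (failedStages, completedStages) = progress.synchronized {
  (progress.numFailedStages, progress.numCompletedStages)
}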


 

 

From: Krot Viacheslav [mailto:krot.vyaches...@gmail.com] 
Sent: Tuesday, June 16, 2015 2:35 PM
To: user@spark.apache.org
Subject: stop streaming context of job failure

 

Hi all,

Is there a way to stop the streaming context when some batch processing fails?

I want to set a reasonable retry count, say 10, and if it still fails - stop the 
context completely.

Is that possible?
