If you look at the "startApplication" method, it takes SparkAppHandle.Listener instances as parameters; the listener's stateChanged callback fires on job state transitions, so you get notified instead of polling.
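
For example, here's a minimal sketch of wiring a listener into
startApplication (the jar path, main class, and master below are
placeholder values for your own job):

import java.util.concurrent.CountDownLatch;
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class SparkJobNotifier {
  public static void main(String[] args) throws Exception {
    CountDownLatch done = new CountDownLatch(1);

    SparkAppHandle handle = new SparkLauncher()
        .setAppResource("/path/to/your-app.jar")   // placeholder
        .setMainClass("com.example.YourSparkApp")  // placeholder
        .setMaster("yarn")                         // placeholder
        .startApplication(new SparkAppHandle.Listener() {
          @Override
          public void stateChanged(SparkAppHandle h) {
            // Called on every state transition; no polling required.
            System.out.println("State: " + h.getState());
            if (h.getState().isFinal()) {
              // Final states include FINISHED, FAILED, and KILLED.
              done.countDown();
            }
          }

          @Override
          public void infoChanged(SparkAppHandle h) {
            // Called when application info (e.g. the app ID) changes.
            System.out.println("App id: " + h.getAppId());
          }
        });

    done.await();  // returns once the Spark job reaches a final state
  }
}

With this in place, the delegator map task can just block on the latch
instead of looping over handle.getState().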

On Fri, Oct 28, 2016 at 10:23 AM, Elkhan Dadashov <elkhan8...@gmail.com> wrote:
> Hi,
>
> I know that we can use SparkAppHandle (introduced in SparkLauncher version
> >= 1.6) and let the delegator map task stay alive until the Spark job
> finishes. But I wonder if this can be done via a callback notification
> instead of polling.
>
> Can I get a callback notification on Spark job completion?
>
> This would be similar to Hadoop, which provides a callback on MapReduce job
> completion: a notification instead of polling.
>
> At job completion, an HTTP request is sent to the
> “job.end.notification.url” value; both the JOB_ID and JOB_STATUS can be
> retrieved from the notification URL.
>
> ...
> Configuration conf = this.getConf();
> // Set the callback parameters
> conf.set("job.end.notification.url",
>     "https://hadoopi.wordpress.com/api/hadoop/notification/$jobId?status=$jobStatus");
> ...
> // Submit your job in background
> job.submit();
>
> At job completion, an HTTP request is sent to the
> “job.end.notification.url” value:
>
> https://<callback-url>/api/hadoop/notification/job_1379509275868_0002?status=SUCCEEDED
>
> Reference:
> https://hadoopi.wordpress.com/2013/09/18/hadoop-get-a-callback-on-mapreduce-job-completion/
>
> Thanks.



-- 
Marcelo
