I started a quick hack for that in the Spark Notebook; you can head to:
https://github.com/andypetrella/spark-notebook/blob/master/common/src/main/scala/notebook/front/widgets/SparkInfo.scala

On Tue Nov 18 2014 at 2:44:48 PM Aniket Bhatnagar <
aniket.bhatna...@gmail.com> wrote:

> I am writing yet another Spark job server and have been able to submit
> jobs and return/save results. I let multiple jobs use the same spark
> context but I set job group while firing each job so that I can in future
> cancel jobs. Further, what I'd like to do is provide some kind of status
> update/progress on running jobs (a % completion would be awesome), but I am
> unable to figure out the appropriate Spark API to use. I do, however, see
> status reporting in the Spark UI, so there must be a way to get the status of
> the various stages per job group. Any hints on which APIs I should look at?
