You can get detailed information about each stage through the Spark listener interface. Multiple jobs may be compressed into a single stage, so job-wise information would be the same as what Spark reports per stage.

Regards,
Mayur
Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>


On Tue, Apr 1, 2014 at 11:18 AM, Kevin Markey <kevin.mar...@oracle.com> wrote:

> The discussion there hits on the distinction between jobs and stages. When
> looking at one application, there are hundreds of stages, sometimes
> thousands; it depends on the data and the task. And the UI seems to track
> stages, and one could independently track them for such a job. But what
> if -- as occurs in another application -- there are only one or two stages,
> but lots of data passing through those one or two stages?
>
> Kevin Markey
>
>
> On 04/01/2014 09:55 AM, Mark Hamstra wrote:
>
> Some related discussion: https://github.com/apache/spark/pull/246
>
>
> On Tue, Apr 1, 2014 at 8:43 AM, Philip Ogren <philip.og...@oracle.com> wrote:
>
>> Hi DB,
>>
>> Just wondering if you ever got an answer to your question about
>> monitoring progress -- either offline or through your own investigation.
>> Any findings would be appreciated.
>>
>> Thanks,
>> Philip
>>
>>
>> On 01/30/2014 10:32 PM, DB Tsai wrote:
>>
>>> Hi guys,
>>>
>>> When we're running a very long job, we would like to show users the
>>> current progress of the map and reduce jobs. After looking at the API
>>> documentation, I don't find anything for this. However, in the Spark UI,
>>> I can see the progress of the tasks. Is there anything I've missed?
>>>
>>> Thanks.
>>>
>>> Sincerely,
>>>
>>> DB Tsai
>>> Machine Learning Engineer
>>> Alpine Data Labs
>>> --------------------------------------
>>> Web: http://alpinenow.com/
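
[For reference, a minimal sketch of the listener approach Mayur describes, assuming a Spark 1.x-style `org.apache.spark.scheduler.SparkListener` API; the class name `ProgressListener` and the println-based reporting are illustrative, not part of Spark itself:]

```scala
import org.apache.spark.SparkContext
import org.apache.spark.scheduler.{
  SparkListener,
  SparkListenerStageSubmitted,
  SparkListenerStageCompleted,
  SparkListenerTaskEnd
}

// Hypothetical progress tracker: reports stage lifecycle events and
// counts finished tasks per stage as they complete.
class ProgressListener extends SparkListener {

  // Number of tasks that have finished in each stage so far.
  private val finished = scala.collection.mutable.Map[Int, Int]()

  override def onStageSubmitted(s: SparkListenerStageSubmitted): Unit =
    println(s"Stage ${s.stageInfo.stageId} submitted " +
            s"(${s.stageInfo.numTasks} tasks)")

  override def onTaskEnd(t: SparkListenerTaskEnd): Unit = {
    val done = finished.getOrElse(t.stageId, 0) + 1
    finished(t.stageId) = done
    println(s"Stage ${t.stageId}: $done tasks finished")
  }

  override def onStageCompleted(s: SparkListenerStageCompleted): Unit =
    println(s"Stage ${s.stageInfo.stageId} completed")
}

// Attach the listener before running the job:
//   sc.addSparkListener(new ProgressListener())
```

[This gives stage-level progress, which, as noted in the thread, may not map one-to-one onto jobs when several jobs collapse into a single stage.]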