The Spark UI has timing information. When running locally, it is at
http://localhost:4040
Otherwise, the URL of the UI is printed to the console when you start
the Spark shell or submit a job.
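
If you want the timings programmatically rather than reading them off the
UI, one option is to register a SparkListener and log stage durations
yourself. This is only a rough, untested sketch against the 1.x listener
API; the stage name usually reflects the call site, which helps map the
time back to the load/map/filter/reduce steps:

  import org.apache.spark.scheduler.{SparkListener, SparkListenerStageCompleted}

  sc.addSparkListener(new SparkListener {
    // Called once per completed stage; the timestamps are Options,
    // so only print when both are set.
    override def onStageCompleted(stageCompleted: SparkListenerStageCompleted): Unit = {
      val info = stageCompleted.stageInfo
      for (start <- info.submissionTime; end <- info.completionTime) {
        println(s"Stage ${info.stageId} '${info.name}' took ${end - start} ms")
      }
    }
  })

After that, any action you run in the shell prints a per-stage duration
to the driver console.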

Reza

On Fri, Oct 24, 2014 at 5:51 AM, shahab <shahab.mok...@gmail.com> wrote:

> Hi,
>
> I just wonder if there is any built-in function to get the execution time
> for each of the jobs/tasks? In simple words, how can I find out how much
> time is spent on the loading/mapping/filtering/reducing parts of a job? I
> can see printouts in the logs, but since there is no clear presentation of
> the underlying DAG and its associated tasks, it is hard to find what I am
> looking for.
>
> best,
> /Shahab
>
