Ah, I see - so it's more like 're-used stages', which isn't necessarily a bug
in the program.
Thanks for the pointer to the comment.
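
If I understand the comment correctly, something like this spark-shell
sequence would show a skipped stage (the RDD here is made up for
illustration, and `sc` is the shell's built-in SparkContext):

    // reduceByKey introduces a shuffle, i.e. a separate shuffle map stage.
    val counts = sc.parallelize(1 to 100000).map(i => (i % 10, 1)).reduceByKey(_ + _)

    // Job 1: runs both the shuffle map stage and the result stage.
    counts.collect()

    // Job 2: depends on the same shuffle. The map output is still available,
    // so the UI lists that stage as "skipped" instead of recomputing it.
    counts.count()

So the more jobs share earlier shuffle output, the more skipped stages
accumulate, which would explain the skipped count overtaking the completed
count.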

Thanks
Shivaram

On Wed, Jan 7, 2015 at 2:00 PM, Mark Hamstra <m...@clearstorydata.com>
wrote:

> That's what you want to see.  The computation of a stage is skipped if the
> results for that stage are still available from the evaluation of a prior
> job run:
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/ui/jobs/JobProgressListener.scala#L163
>
> On Wed, Jan 7, 2015 at 12:32 PM, Corey Nolet <cjno...@gmail.com> wrote:
>
>> Sorry, replace ### with an actual number. What does a "skipped" stage
>> mean? I'm running a series of jobs and it seems like after a certain point,
>> the number of skipped stages is larger than the number of actual completed
>> stages.
>>
>> On Wed, Jan 7, 2015 at 3:28 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>>> Looks like the number of skipped stages couldn't be formatted.
>>>
>>> Cheers
>>>
>>> On Wed, Jan 7, 2015 at 12:08 PM, Corey Nolet <cjno...@gmail.com> wrote:
>>>
>>>> We just upgraded to Spark 1.2.0 and we're seeing this in the UI.
>>>>
>>>
>>>
>>
>
