On Thu, Dec 3, 2009 at 10:09 AM, Dimitri Glazkov <dglaz...@google.com>wrote:
> On Thu, Dec 3, 2009 at 10:07 AM, Ojan Vafai <o...@google.com> wrote:
> > +chromium-dev as others who look at the waterfall might also be confused.
> >
> > On Thu, Dec 3, 2009 at 8:50 AM, Dimitri Glazkov <dglaz...@google.com> wrote:
> >> Sending out random people, because it's early :)
> >>
> >> There's a couple of things I see on the bot this morning:
> >>
> >> 1) There's a crashing test on all bots -- and the tree is still green!
> >>
> >> http://src.chromium.org/viewvc/chrome/trunk/src/webkit/tools/layout_tests/flakiness_dashboard.html#tests=LayoutTests/plugins/embed-attributes-setting.html
> >
> > The test is consistently crashing when run with all the other tests, but
> > passing when we retry it in isolation. Note that the test is listed as an
> > unexpected flaky test on the waterfall. This is one of the downsides of
> > retrying failing tests: we can't distinguish flakiness from this case. We
> > just need to be careful not to ignore unexpected flakiness on the waterfall.
> > Note that the dashboard only shows the result from the first run. Including
> > the retry results from the bots seems like more trouble than it's worth.
>
> Should unexpected flakiness turn the bot red?

If it turns the bot red, then it defeats the purpose of that code. Might as well not retry and mark it as FAIL (which turns the tree red).

Nicolas

> :DG<
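
For what it's worth, the policy being described could be sketched roughly like this. All names here are hypothetical (this is not the actual test-runner code); it just illustrates why a test that crashes in the full run but passes on an isolated retry gets labeled flaky and leaves the tree green:

```python
# Hypothetical sketch of the retry policy under discussion.
# first_run / retry_run map test name -> result ("PASS", "FAIL", "CRASH").

def classify_results(first_run, retry_run):
    """Classify each test that did not pass on the first run."""
    classification = {}
    for test, result in first_run.items():
        if result == "PASS":
            continue
        # Passing on an isolated retry counts as flakiness -- which is
        # exactly how a consistent crash-in-suite can slip through green.
        if retry_run.get(test) == "PASS":
            classification[test] = "flaky"
        else:
            classification[test] = "fail"
    return classification

def tree_color(classification):
    # Only outright failures turn the tree red; flaky tests do not,
    # since turning red on flakiness would defeat the point of retrying.
    return "red" if "fail" in classification.values() else "green"
```

Under this sketch, `embed-attributes-setting.html` crashing in the full run but passing in isolation would be classified "flaky", so the tree stays green even though the crash is fully reproducible in-suite.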