> If we really wanted to know, either someone would have to spend some time
> doing this over and over, or we'd have to use Telemetry with some A/B testing.

This would actually be a pretty easy thing to do, to a first
approximation anyway.  Just turn off PGO on Windows for one nightly
build and see how that affects all our metrics.

I'll grant that's not a proper A/B study, but it'd probably be good enough.
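For the metrics side, a first cut at that comparison could look something like the sketch below. The numbers are made up (only in the rough ballpark of the Ts figures Dave quotes later in the thread), and the function names are mine, not anything from Talos or Telemetry:

```python
# Hypothetical sketch: comparing a Talos-style metric (e.g., Ts startup time, ms)
# between a PGO and a non-PGO nightly. Sample values below are invented for
# illustration only.
from statistics import mean, stdev
from math import sqrt

def relative_improvement(pgo, non_pgo):
    """Fraction by which the PGO mean beats the non-PGO mean (lower is better)."""
    return (mean(non_pgo) - mean(pgo)) / mean(non_pgo)

def welch_t(a, b):
    """Welch's t statistic -- a rough check that the gap isn't just noise."""
    va = stdev(a) ** 2 / len(a)
    vb = stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / sqrt(va + vb)

# Made-up per-run Ts samples (ms) for one nightly of each flavor.
pgo_ts = [518, 522, 520, 525, 515]
non_pgo_ts = [588, 592, 590, 585, 595]

print(f"improvement: {relative_improvement(pgo_ts, non_pgo_ts):.1%}")
print(f"Welch t: {welch_t(non_pgo_ts, pgo_ts):.1f}")
```

With real Telemetry data you'd want far more samples per arm and a proper test, but even this crude shape would tell you whether turning PGO off moved a metric by more than run-to-run noise.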

On Fri, Oct 19, 2012 at 9:55 PM, Dave Mandelin <dmande...@gmail.com> wrote:
> On Thursday, October 18, 2012 4:59:10 AM UTC-7, Ted Mielczarek wrote:
>> If you're interested in the benchmark side of things, it's fairly easy
>> to compare now that we build both PGO and non-PGO builds on a regular
>> basis. I'm having a little trouble getting graphserver to give me recent
>> data, but you can pick arbitrary tests that we run on Talos and graph
>> them side-by-side for the PGO and non-PGO cases. For example, here's Ts
>> and "Tp5 MozAfterPaint" for Windows 7 on both PGO and non-PGO builds
>> (the data ends in February for some reason):
>>
>> http://graphs.mozilla.org/graph.html#tests=[[16,1,12],[115,1,12],[16,94,12],[115,94,12]]&sel=none&displayrange=365&datatype=running
>>
>> You can see that there's a pretty solid 10-20% advantage to PGO in these
>> tests.
>
> Ah. That answers my question about more data.
>
> For Ts, I see a difference of only 70ms (e.g., 520-590 at the last point). 
> That's borderline trivial, but the differences I measure are much greater. 
> What does Ts actually measure, anyway? Is it measuring only from main() 
> starting to first paint, or something like that?
>
> For Tp5, I see a difference of 80ms (330-410 and such). I'm not really sure 
> what to make of that. By itself, it doesn't necessarily seem like it would be 
> that noticeable, but the fraction is big enough that if it holds up for 
> longer and bigger pages, I could see it slightly improving pageloads and 
> probably also reducing some pauses for layout and such. From what I 
> understand about Tp5, it's not really measuring modern pageloads (ignores 
> network and isn't focused on popular sites). I wish we had something more 
> representative so we could draw better conclusions (and not just about PGO).
>
>> Here's Dromaeo (DOM) which displays a similar 20% advantage:
>>
>> http://graphs.mozilla.org/graph.html#tests=[[73,94,12],[73,1,12]]&sel=none&displayrange=365&datatype=running
>>
>> It's certainly hard to draw a conclusion about your hypothesis from just
>> benchmarks, but when almost all of our benchmarks display 10-20%
>> reductions on PGO builds it seems fair to say that that's likely to be
>> user-visible.
>
> It seems fair to me to say that core browser CPU-bound tasks are likely to be 
> 10-20% faster. There is probably some of that users can notice, although I'm 
> not sure exactly what it would be. The JS benchmarks do show a speedup between 
> the two builds, but I haven't tested other JS-based things to see if it's 
> noticeable. I guess I should be testing game framerates or something like 
> that too.
>
>> We've spent hundreds of man-hours for perf gains far less than that.
>
> Yes, we need to get more judicious about how we apply our perf efforts. :-)
>
>> On a related note, Will Lachance has been tasked with getting our
>> Eideticker performance measurement framework working with Windows, so we
>> should be able to experimentally measure user-visible responsiveness in
>> the near future.
>
> I'm curious to see what kinds of tests it will enable.
>
> Dave
> _______________________________________________
> dev-platform mailing list
> dev-platform@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-platform