Hi All,

One of the challenges with optimizing modern graphics apps is that
there is a complex mix of paging/viewer/database/threading parameters
to experiment with, and also a huge range of ways to measure
performance/quality.  Frame rate is the metric graphics developers
most often use to measure progress, but for professional sims my guess
is that it's not the best metric we could come up with, hence this
email.

So what metrics could we come up with to measure how well the system
is performing?  Here are a few to start with:

1) Frame rate with vsync off.
2) Frame rate with vsync on.
3) Number of dropped frames, i.e. frames whose time exceeds a
specified cut-off.
4) Histogram of frame rates.
5) Latency, i.e. the number of frames/absolute time it takes to load a
tile once it's requested.
6) Plots of update, cull, draw, database pager times per frame against
frame number.
7) A frame-drop risk factor, i.e. how close the frame came to being dropped.
8) Latency between the beginning of the frame and the final swap
buffers happening on the graphics card.

Can you think of any more?

It'd be good if we could come up with an overall quality metric that
we could apply to a whole test run.  This would also entail a
controlled application run.  Animation paths get us some way down this
route, but only part way.

There is also the issue of how to record all the stats for a run, and
how to review them afterwards.

I don't have quick answers to this, and I'm not about to volunteer
myself to come up with such a framework (at least not straight away
:-) but I can see the need for it, so thought I'd punt the topic out
there for others to chip in with thoughts on the way forward.
Volunteers welcome too :-)

Robert.
_______________________________________________
osg-users mailing list
osg-users@openscenegraph.net
http://openscenegraph.net/mailman/listinfo/osg-users
http://www.openscenegraph.org/