Hi Bob,

The 2 frame latency thing didn't smell right to me either, having used
SGI machines extensively in the past. However, after doing various
closed-loop latency tests I'm convinced that the latency of modern
Nvidia cards is higher than I would have expected, and the 2 frames of
latency when OpenGL is synced to the vertical retrace appears to
account for it.

My test involves an analog camera feeding a framegrabber board; each
captured frame is downloaded to a texture and displayed on the screen,
with the frame time in seconds and milliseconds overlaid on top. By
pointing the camera at the screen you create a tunnel of frame times,
and the difference between successive times is the closed-loop
latency. This measurement technique is probably accurate to ~10ms.
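
For anyone who wants to reproduce the overlay part of the test,
something along these lines does the job (just a sketch; the HUD
resolution, text size and position are placeholders, and the video
texture display is omitted):

  #include <osg/Camera>
  #include <osg/Geode>
  #include <osg/Timer>
  #include <osgText/Text>
  #include <iomanip>
  #include <sstream>

  // HUD camera that draws the current frame time, so a camera pointed
  // at the screen can measure the closed-loop latency.
  osg::Camera* createTimeHud(osgText::Text*& textOut)
  {
      osgText::Text* text = new osgText::Text;
      text->setCharacterSize(48.0f);
      text->setPosition(osg::Vec3(20.0f, 20.0f, 0.0f));
      text->setDataVariance(osg::Object::DYNAMIC); // updated every frame

      osg::Geode* geode = new osg::Geode;
      geode->addDrawable(text);

      osg::Camera* hud = new osg::Camera;
      hud->setProjectionMatrixAsOrtho2D(0.0, 1280.0, 0.0, 1024.0);
      hud->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
      hud->setViewMatrix(osg::Matrix::identity());
      hud->setClearMask(GL_DEPTH_BUFFER_BIT);
      hud->setRenderOrder(osg::Camera::POST_RENDER);
      hud->addChild(geode);

      textOut = text;
      return hud;
  }

  // Once per frame, before rendering:
  //   std::ostringstream os;
  //   os << std::fixed << std::setprecision(3)
  //      << osg::Timer::instance()->time_s();
  //   text->setText(os.str());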

With vsync on, the latency is around 30-40ms worse than with it off,
which is roughly two frames' worth of 60Hz video. The draw time is
well under a millisecond, so the latency that shows up as two frames
with vsync on is probably only 1-2ms with it off.

As for ATI not letting Nvidia get away with it: for both companies the
high-end market is first-person-shooter enthusiasts willing to buy
multiple graphics cards and SLI them. From what I've read, most of the
people in that market who care about latency have been playing with
vsync off for over ten years, so they wouldn't notice or care. The
graphics benchmarks also tend to be run with vsync off, and they
normally only measure throughput, so they wouldn't pick up the latency
anyway.

Both Nvidia and ATI have been known to play dirty tricks with drivers
to improve benchmark scores in the past. However, I don't think this
is one of them: queuing frames ahead is a feature of DirectX, and some
Windows drivers even have a control panel setting to change it. I
don't think that setting affects the OpenGL behaviour, though.
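
One way to cap how far the driver queues ahead on the OpenGL side is
to put a fence after the buffer swap and wait on it before issuing the
next frame (needs GL 3.2 or ARB_sync; GLEW below is just one choice of
loader). This is only a sketch, and I haven't verified that it removes
the 2 frames on these cards:

  #include <GL/glew.h>  // or any other extension loader / GL 3.2 header

  // Call immediately after the platform buffer swap
  // (SwapBuffers / glXSwapBuffers / aglSwapBuffers, whichever applies).
  void throttleFrameQueue()
  {
      // Fence everything issued so far, then block the CPU until the
      // GPU has consumed it, so at most ~one frame is ever in flight.
      GLsync fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
      glClientWaitSync(fence, GL_SYNC_FLUSH_COMMANDS_BIT,
                       (GLuint64)100 * 1000 * 1000); // 100ms timeout, in ns
      glDeleteSync(fence);
  }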

Colin.

PS: I've only tested the 7600 and 9400 properly, but I've seen
symptoms of the latency on GeForce 8 series cards too. I think the
GeForce 5 and 6 didn't have this 'feature', though. Interestingly, I
also found that some monitors add over 66 milliseconds of extra
latency of their own.

 

> -----Original Message-----
> From: osg-users-boun...@lists.openscenegraph.org 
> [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf 
> Of Buckley, Bob CTR MDA/DES
> Sent: 06 August 2010 00:15
> To: OpenSceneGraph Users
> Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)
> 
> The sync blocks on a swap buffers call, not on the first 
> OpenGL call (unless by accident it's an actual OpenGL blocking call).
> Non-blocking OpenGL calls run asynchronously on the back buffer 
> (traditional double buffering).
> A glGet call is an actual OpenGL blocking call as it forces a 
> pipe flush.
> Making blocking calls when trying to run real time thoroughly 
> defeats the purpose.
> 
> Giovanni, what you're seeing is typical behavior when syncing 
> with the vertical retrace.
> To maintain real-time at 50Hz each frame must be rendered in 
> less than 20ms (1/50).
> If a frame just happens to take 21ms, then the buffer swap 
> will block for 19ms before actually swapping buffers.
> Hence, your frame rate is cut completely in half (21ms + 19ms 
> = 40ms = 25Hz).
> And it also introduces nasty temporal aliasing.
> 
> I'm not aware of another way to synchronize with such 
> regularity as the monitor retrace.
> I'm guessing that deterministic hardware is required given 
> the nature of something like OpenSceneGraph on a PC.
> Although, you can achieve near real-time by things like 
> database tuning and pre-emptive rendering.
> But, nothing to guarantee actual real time.
> 
> Extra hardware is not needed to run at multiples of the frame rate.
> Just set the retrace rate to the desired frame rate and run sync'd.
> Boom - multiples of the frame rate.
> 
> 
> BTW, there's something about this alleged '2 frame latency' 
> charge that just doesn't pass the smell test.
> Mostly - ATI sure the hell wouldn't let 'em!
> 
> Bob
> 
> -----Original Message-----
> From: osg-users-boun...@lists.openscenegraph.org 
> [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf 
> Of Bunfield, Dennis AMRDEC/AEgis
> Sent: Wednesday, June 23, 2010 4:35 PM
> To: osg-users@lists.openscenegraph.org
> Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)
> 
> Classification: UNCLASSIFIED
> Caveats: FOUO
> 
> The sync actually blocks on the first OpenGL call.  So your 
> OSG update and cull stages will run, then you will block on 
> the draw stage until the vsync.  Your problem is actually 
> worse than 20ms without you knowing it.
> For Nvidia cards there is a built-in 2 frames of latency.  So 
> even after your sync you won't see the image you updated come 
> out of the DVI until 2 render frames later.
> 
> In order for you to do what you want you will need some 
> expensive frame lock hardware with external syncs to run at a 
> multiple of the frame rate.
> 
> -----Original Message-----
> From: osg-users-boun...@lists.openscenegraph.org
> [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf 
> Of Giovanni Ferrari
> Sent: Wednesday, June 23, 2010 3:28 AM
> To: osg-users@lists.openscenegraph.org
> Subject: [osg-users] OSG and vsync
> 
> Hi,
> 
> I'm developing an application that needs vsync to avoid 
> tearing, but I'm having some problems.
> Here is the (really simplified) code structure of my application: 
> 
> 
> Code:
> 
> ...
> while ( !mViewer->done() ) {
>   mGraph->Update(...);
>   mViewer->frame();
> }
> 
> I've noticed the frame() function is blocking when vsync is enabled.
> This means that I call the frame function, stay blocked for 
> something like 20ms (50Hz PAL), and then must finish the 
> Update call within the vsync period, otherwise I won't be able 
> to draw a new frame (the graphics card draws the old contents 
> of the frame buffer without the changes performed in the 
> update function; the changes only appear in the next frame).
> 
> As you can imagine this is a big problem for real-time 
> applications because I'm introducing 20ms of delay.
> 
> Is there a way to synchronize the frame call without using vsync? 
> Or can I decompose the frame function so as to separate the 
> functions that operate on the graph from those that perform 
> the rendering? 
> The second solution would help me because I could operate on 
> the graph with mGraph->Update(...) even while the part of 
> frame() that writes the frame buffer is blocked.
> 
> I hope I've explained my problem clearly.
> 
> Thank you!
> 
> Cheers,
> Giovanni
> 
> ------------------
> Read this topic online here:
> http://forum.openscenegraph.org/viewtopic.php?p=29295#29295
> 
_______________________________________________
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
