Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-08-07 Thread Bunfield, Dennis AMRDEC/AEgis
The glGet problem is totally separate from the HW latency; it causes your 
rendering to become non-deterministic, in that it may drop frames when frame 
locking to a specified rate.  The HW latency is deterministic, which is a good 
thing, because knowing it our sims can adjust accordingly. We have tested the 
latency from the Quadro 5500 through the 5800.  We have not tested any of the 
new Fermi-based cards yet. The reason they have the latency is to optimize the 
pipeline and increase frame rate performance.  So at this point, I do not 
expect Fermi-based cards to be any different.  Latency will be the first thing 
we test when we get our hands on a Quadro 6000 later this year.



From: osg-users-boun...@lists.openscenegraph.org on behalf of Buckley, Bob CTR 
MDA/DES
Sent: Fri 8/6/2010 6:02 PM
To: OpenSceneGraph Users
Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)



Who at nVidia confirmed this built-in latency, and for what product 
line? Inserting a glGet is inducing latency, not something built in.

Bob

-Original Message-
From: osg-users-boun...@lists.openscenegraph.org 
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Bunfield, 
Dennis AMRDEC/AEgis
Sent: Monday, June 28, 2010 12:39 PM
To: OpenSceneGraph Users
Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)

Classification: UNCLASSIFIED
Caveats: FOUO

Yes, we found this through internal testing.  Nvidia later confirmed it.
This isn't related to double or triple buffering either.  The pipeline,
as explained to me, is similar to a brake line in a car.  Everything
works well unless you inject an air bubble into the brake line.  That
would be similar to issuing certain glGet commands. The driver will tell
the GPU to stop its processing so that it can handle your glGet
request. So for real-time programming you really need to be aware of
this -- and not do it.  Depending upon the type of readback you are
performing, you will introduce a momentary lag in the system because the
GPU has to stop everything it is doing to respond back to you.
glReadPixels behaves a little differently: as long as the pixel format
aligns with the framebuffer format, the driver can just DMA that
framebuffer back to the CPU. If your pixel formats are not aligned you
will get bad performance again, because the GPU will have to stop what
it is doing and reformat the data to send back.
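
A minimal sketch of that distinction (my own illustration, not code from this
thread): the readback format is assumed to match this particular framebuffer --
GL_BGRA with GL_UNSIGNED_BYTE is often the native layout on these cards, but
that needs to be verified on the target hardware -- and the glGet is shown only
as the kind of mid-frame call to avoid.

#include <GL/gl.h>
#include <vector>

// Fast path: pixel format matches the framebuffer, so the driver can DMA the
// framebuffer straight back to the CPU without reformatting on the GPU.
// (Assumption: BGRA/UNSIGNED_BYTE really is this framebuffer's layout.)
void readbackFrame(int width, int height, std::vector<unsigned char>& pixels)
{
    pixels.resize(static_cast<size_t>(width) * height * 4);
    glReadPixels(0, 0, width, height, GL_BGRA, GL_UNSIGNED_BYTE, &pixels[0]);
}

// The kind of call to avoid mid-frame: the driver has to halt the GPU's
// pipeline so it can answer, which shows up as a momentary stall.
void whatNotToDoMidFrame()
{
    GLint depthBits = 0;
    glGetIntegerv(GL_DEPTH_BITS, &depthBits);
}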


-Original Message-
From: osg-users-boun...@lists.openscenegraph.org
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Bruce
Wheaton
Sent: Monday, June 28, 2010 12:46 PM
To: OpenSceneGraph Users
Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)

On Jun 23, 2010, at 3:35 PM, Bunfield, Dennis AMRDEC/AEgis wrote:

 For Nvidia cards there is a built in 2 frames of latency.  So even
after
 your sync you won't see the image you updated come out the DVI until 2
 render frames later.

Where does this information come from Dennis? Where is this delay
happening? I doubt it's triple buffering, since the extra memory would
have to be accounted for, and it makes tearing mostly impossible (as on
the Mac).

So you believe the Gl command queue is buffered/delayed somewhere?
Doesn't that have huge implications for things like glGet, making them
impossible to use without at least halving the frame rate?

Bruce




___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
Classification: UNCLASSIFIED
Caveats: FOUO


___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org




Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-08-06 Thread Middleton, Colin (GE Intelligent Platforms)
Hi Bob,

The 2 frame latency thing didn't smell right to me either, having used
SGI machines extensively in the past. However, after doing various
closed loop latency tests, I'm convinced that the latency of modern
Nvidia cards is higher than I would have expected. A 2 frame latency
when OpenGL is synced would account for this.

My tests involve an analog camera feeding into a framegrabber board,
which then downloads to texture and is displayed on the screen.
Overlaid on that is a frame time in seconds and milliseconds. By
pointing the camera at the screen you can create a tunnel of frame
times; the time between them is the closed loop latency. This
measurement technique is probably accurate to ~10ms.
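
For reference, the overlay part of such a test is just a per-frame timestamp.
A minimal sketch with OSG/osgText (the HUD setup, window size and character
size here are only illustrative, and a real rig would draw the text over the
grabbed camera image rather than an empty scene):

#include <osgViewer/Viewer>
#include <osgText/Text>
#include <osg/Camera>
#include <osg/Geode>
#include <osg/Timer>
#include <sstream>
#include <iomanip>

int main()
{
    osgViewer::Viewer viewer;

    osg::ref_ptr<osgText::Text> text = new osgText::Text;
    text->setCharacterSize(48.0f);
    text->setPosition(osg::Vec3(20.0f, 20.0f, 0.0f));

    osg::ref_ptr<osg::Geode> geode = new osg::Geode;
    geode->addDrawable(text.get());

    // Orthographic post-render camera so the text is drawn on top.
    osg::ref_ptr<osg::Camera> hud = new osg::Camera;
    hud->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
    hud->setProjectionMatrixAsOrtho2D(0.0, 1280.0, 0.0, 1024.0);
    hud->setRenderOrder(osg::Camera::POST_RENDER);
    hud->setClearMask(GL_DEPTH_BUFFER_BIT);
    hud->addChild(geode.get());

    viewer.setSceneData(hud.get());
    viewer.realize();

    while (!viewer.done())
    {
        // Stamp this frame with the current time in seconds.milliseconds.
        std::ostringstream os;
        os << std::fixed << std::setprecision(3)
           << osg::Timer::instance()->time_s();
        text->setText(os.str());

        viewer.frame();
    }
    return 0;
}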

With vsync on, the latency is around 30-40ms worse than when it is off.
This is about 2 frames' worth of 60Hz video. The draw time is less than a
millisecond, so with vsync off the latency is probably only 1-2ms.

As for ATI not letting nVidia get away with it: for both companies the
high end market is first person shooter junkies willing to buy multiple
graphics cards and SLI them. From what I've read, most of the people who
care about latency have been playing with vsync off for over ten years,
so they wouldn't notice or care. Also, the graphics benchmarks tend to be
run with vsync off, but they normally just measure throughput, so they
wouldn't pick up latency anyway.

Both nVidia and ATI have been known to play dirty tricks with drivers to
improve benchmarks in the past. However, I don't think this is one of
them, as queued frames of latency are a feature of DirectX; there is even
a setting in the control panel of some Windows drivers to change it. I
don't think this setting changes the OpenGL behaviour though.

Colin.

PS I've only tested the 7600 and 9400 properly, but I've seen symptoms of
latency on GeForce 8 series cards too. I think the GeForce 5 and 6 didn't
have this 'feature' though. Interestingly, I also found that there were
some monitors that had over 66 milliseconds of extra latency.

 

 -Original Message-
 From: osg-users-boun...@lists.openscenegraph.org 
 [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf 
 Of Buckley, Bob CTR MDA/DES
 Sent: 06 August 2010 00:15
 To: OpenSceneGraph Users
 Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)
 
 The sync blocks on a swap buffers call, not on the first 
 OpenGL call (unless by accident it's an actual OpenGL blocking call).
 Non-blocking OpenGL calls run asynchronous on the back buffer 
 (traditional double buffering).
 A glGet call is an actual OpenGL blocking call as it forces a 
 pipe flush.
 Making blocking calls when trying to run real time thoroughly 
 defeats the purpose.
 
 Giovanni, what you're seeing is typical behavior when syncing 
 with the vertical retrace.
 To maintain real-time at 50Hz each frame must be rendered in 
 less than 20ms (1/50).
 If a frame just happens to take 21ms, then the buffer swap 
 will block for 19ms before actually swapping buffers.
 Hence, your frame rate is cut completely in half (21ms + 19ms 
 = 40ms = 25Hz).
 And it also introduces nasty temporal aliasing.
 
 I'm not aware of another way to synchronize with such 
 regularity as the monitor retrace.
 I'm guessing that deterministic hardware is required given 
 the nature of something like OpenSceneGraph on a PC.
 Although, you can achieve near real-time by things like 
 database tuning and pre-emptive rendering.
 But, nothing to guarantee actual real time.
 
 Extra hardware is not needed to run at multiples of the frame rate.
 Just set the retrace rate to the desired frame rate and run sync'd.
 Boom - multiples of the frame rate.
 
 
 BTW, there's something about this alleged '2 frame latency' 
 charge that just doesn't pass the smell test.
 Mostly - ATI sure the hell wouldn't let 'em!
 
 Bob
 
 -Original Message-
 From: osg-users-boun...@lists.openscenegraph.org 
 [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf 
 Of Bunfield, Dennis AMRDEC/AEgis
 Sent: Wednesday, June 23, 2010 4:35 PM
 To: osg-users@lists.openscenegraph.org
 Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)
 
 Classification: UNCLASSIFIED
 Caveats: FOUO
 
 The sync actually blocks on the first OpenGL call.  So your 
 osg update and cull stages will run then you will block on 
 the draw stage until the vsync.  Your problem is actually 
 worse than 20ms w/o you knowing it.
 For Nvidia cards there is a built in 2 frames of latency.  So 
 even after your sync you won't see the image you updated come 
 out the DVI until 2 render frames later.
 
 In order for you to do what you want you will need some 
 expensive frame lock hardware with external syncs to run at a 
 multiple of the frame rate.
 
 -Original Message-
 From: osg-users-boun...@lists.openscenegraph.org
 [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf 
 Of Giovanni Ferrari
 Sent: Wednesday, June 23, 2010 3:28 AM
 To: osg-users@lists.openscenegraph.org

Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-08-06 Thread Jason Daly

Middleton, Colin (GE Intelligent Platforms) wrote:

PS I've only tested 7600 and 9400 properly, but I've seen symptoms of
latency on GeForce 8 series cards too. I think Geforce 5 and 6 didn't
have this 'feature' though. Interestingly, I also found that there were
some monitors that had over 66 milliseconds of extra latency.
  


From what I've read, newer monitors tend to have a lot more latency 
than the old CRTs did.  My TV at home (a Samsung LED DLP) has almost a 
half second of latency!  It's one of those numbers that I wish more 
manufacturers published.


--J

___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-08-06 Thread Terry Welsh
http://www.digitalversus.com
This is a good resource for finding monitor input lag figures.  Go to
Product Face-Offs > LCD Computer Monitor, then choose a couple of
monitors, and then select the Input Lag vs. a CRT test.  This site was
indispensable for picking out my last monitor, a Dell U2410.  It
rocks.  (Just so you know, I don't work for digitalversus.com or
Dell).
--
Terry Welsh
mogumbo 'at' gmail.com
www.reallyslick.com



 Message: 4
 Date: Fri, 6 Aug 2010 11:26:09 -0400
 From: Jason Daly jd...@ist.ucf.edu
 To: OpenSceneGraph Users osg-users@lists.openscenegraph.org
 Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)
 Message-ID: 4c5c2991.1000...@ist.ucf.edu
 Content-Type: text/plain; charset=ISO-8859-1; format=flowed

 Middleton, Colin (GE Intelligent Platforms) wrote:
 PS I've only tested 7600 and 9400 properly, but I've seen symptoms of
 latency on GeForce 8 series cards too. I think Geforce 5 and 6 didn't
 have this 'feature' though. Interestingly, I also found that there were
 some monitors that had over 66 milliseconds of extra latency.


  From what I've read, newer monitors tend to have a lot more latency
 than the old CRTs did.  My TV at home (a Samsung LED DLP) has almost a
 half second of latency!  It's one of those numbers that I wish more
 manufacturers published.

 --J

___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-08-06 Thread Buckley, Bob CTR MDA/DES
Who at nVidia confirmed this built-in latency, and for what product 
line? Inserting a glGet is inducing latency, not something built in.

Bob

-Original Message-
From: osg-users-boun...@lists.openscenegraph.org 
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Bunfield, 
Dennis AMRDEC/AEgis
Sent: Monday, June 28, 2010 12:39 PM
To: OpenSceneGraph Users
Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)

Classification: UNCLASSIFIED
Caveats: FOUO

Yes, we found this through internal testing.  Nvidia later confirmed it.
This isn't related to double or triple buffering either.  The pipeline,
as explained to me, is similar to a brake line in a car.  Everything
works well unless you inject an air bubble into the brake line.  That
would be similar to issuing certain glGet commands. The driver will tell
the GPU to stop its processing so that it can handle your glGet
request. So for real-time programming you really need to be aware of
this -- and not do it.  Depending upon the type of readback you are
performing, you will introduce a momentary lag in the system because the
GPU has to stop everything it is doing to respond back to you.
glReadPixels behaves a little differently: as long as the pixel format
aligns with the framebuffer format, the driver can just DMA that
framebuffer back to the CPU. If your pixel formats are not aligned you
will get bad performance again, because the GPU will have to stop what
it is doing and reformat the data to send back.


-Original Message-
From: osg-users-boun...@lists.openscenegraph.org
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Bruce
Wheaton
Sent: Monday, June 28, 2010 12:46 PM
To: OpenSceneGraph Users
Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)

On Jun 23, 2010, at 3:35 PM, Bunfield, Dennis AMRDEC/AEgis wrote:

 For Nvidia cards there is a built in 2 frames of latency.  So even
after
 your sync you won't see the image you updated come out the DVI until 2
 render frames later.

Where does this information come from Dennis? Where is this delay
happening? I doubt it's triple buffering, since the extra memory would
have to be accounted for, and it makes tearing mostly impossible (as on
the Mac).

So you believe the Gl command queue is buffered/delayed somewhere?
Doesn't that have huge implications for things like glGet, making them
impossible to use without at least halving the frame rate?

Bruce




___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
Classification: UNCLASSIFIED
Caveats: FOUO


___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-08-05 Thread Buckley, Bob CTR MDA/DES
The sync blocks on a swap-buffers call, not on the first OpenGL call (unless by 
accident it's an actual OpenGL blocking call).
Non-blocking OpenGL calls run asynchronously against the back buffer 
(traditional double buffering).
A glGet call is an actual OpenGL blocking call, as it forces a pipe flush.
Making blocking calls when trying to run real-time thoroughly defeats the 
purpose.

Giovanni, what you're seeing is typical behavior when syncing with the vertical 
retrace.
To maintain real-time at 50Hz each frame must be rendered in less than 20ms 
(1/50).
If a frame just happens to take 21ms, then the buffer swap will block for 19ms 
before actually swapping buffers.
Hence, your frame rate is cut completely in half (21ms + 19ms = 40ms = 25Hz).
And it also introduces nasty temporal aliasing.
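
Bob's arithmetic as a tiny sketch (the function name is mine, purely
illustrative):

#include <cmath>
#include <cstdio>

// Effective period of a displayed frame when syncing to the retrace:
// a frame that misses its budget is held until the next retrace.
double displayedPeriodMs(double renderMs, double refreshHz)
{
    const double retraceMs = 1000.0 / refreshHz;
    return std::ceil(renderMs / retraceMs) * retraceMs;
}

int main()
{
    std::printf("%.0f ms\n", displayedPeriodMs(19.0, 50.0));  // 20 ms -> 50 Hz
    std::printf("%.0f ms\n", displayedPeriodMs(21.0, 50.0));  // 40 ms -> 25 Hz
    return 0;
}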

I'm not aware of another way to synchronize with such regularity as the monitor 
retrace.
I'm guessing that deterministic hardware is required given the nature of 
something like OpenSceneGraph on a PC.
Although, you can achieve near real-time by things like database tuning and 
pre-emptive rendering.
But, nothing to guarantee actual real time.

Extra hardware is not needed to run at multiples of the frame rate.
Just set the retrace rate to the desired frame rate and run sync'd.
Boom - multiples of the frame rate.


BTW, there's something about this alleged '2 frame latency' charge that just 
doesn't pass the smell test.
Mostly - ATI sure as hell wouldn't let 'em!

Bob

-Original Message-
From: osg-users-boun...@lists.openscenegraph.org 
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Bunfield, 
Dennis AMRDEC/AEgis
Sent: Wednesday, June 23, 2010 4:35 PM
To: osg-users@lists.openscenegraph.org
Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)

Classification: UNCLASSIFIED
Caveats: FOUO

The sync actually blocks on the first OpenGL call.  So your OSG update
and cull stages will run, then you will block on the draw stage until the
vsync.  Your problem is actually worse than 20ms without you knowing it.
For Nvidia cards there is a built in 2 frames of latency.  So even after
your sync you won't see the image you updated come out the DVI until 2
render frames later.

In order for you to do what you want you will need some expensive frame
lock hardware with external syncs to run at a multiple of the frame
rate.

-Original Message-
From: osg-users-boun...@lists.openscenegraph.org
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of
Giovanni Ferrari
Sent: Wednesday, June 23, 2010 3:28 AM
To: osg-users@lists.openscenegraph.org
Subject: [osg-users] OSG and vsync

Hi,

I'm developing an application that needs vsync to avoid tearing, but I'm
having some problems.
Here is the (really simplified) code structure of my application: 


Code:

...
while ( !mViewer->done() ) {
  mGraph->Update(...);
  mViewer->frame();
}





I've noticed the frame() function is blocking when vsync is enabled.
This means that I call the frame function, I stay blocked for something
like 20ms (50Hz PAL), and then I must finish the Update call within the
vsync period, otherwise I won't be able to draw a new frame (the graphics
card draws the old content of the frame buffer without the changes made
in the update function; the changes will appear in the next frame).

As you can imagine this is a big problem for real-time applications,
because I'm introducing 20ms of delay.

Is there a way to synchronize the frame call without using vsync? Or can I
decompose the frame function so as to separate the functions that
operate on the graph from those that perform the rendering?
The second solution could help me because I would be able to operate on
the graph with mGraph->Update(...) even while the part of frame() that
writes the framebuffer is blocked.
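
Something like the following is what I have in mind (a rough sketch mirroring
the pseudocode above; the method names are taken from osgViewer::ViewerBase,
which frame() is built from, but I haven't verified where the draw actually
blocks under vsync):

Code:

while ( !mViewer->done() ) {
  mViewer->advance();             // stamp the new frame
  mViewer->eventTraversal();      // process events
  mGraph->Update(...);            // application update, CPU only
  mViewer->updateTraversal();     // update callbacks
  mViewer->renderingTraversals(); // cull + draw + swap: vsync blocks in here
}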

I hope I've explained my problem clearly.

Thank you!

Cheers,
Giovanni

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=29295#29295





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
Classification: UNCLASSIFIED
Caveats: FOUO


___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-08-05 Thread Lilith Bryant
My own research indicates that the timing rules are:

The first OpenGL call will block if the back buffer is not available at that 
point.

The swap-buffers call will block if there is already a pending swap queued (by 
the last call to swap buffers).

So, depending on the circumstances, both can block.
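
A small self-contained sketch of those two blocking points (GLFW is used here
purely for window/context handling; that choice, and the stub drawing, are my
illustration rather than anything from this thread):

#include <GLFW/glfw3.h>

int main()
{
    if (!glfwInit()) return 1;
    GLFWwindow* win = glfwCreateWindow(640, 480, "vsync blocking", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    glfwSwapInterval(1);                  // enable vsync

    while (!glfwWindowShouldClose(win))
    {
        // CPU-side update work goes here; it never touches the GPU.

        glClear(GL_COLOR_BUFFER_BIT);
        // ^ first GL call of the frame: can block if the back buffer is
        //   still owned by a previously queued swap.

        // ... draw calls: normally queued asynchronously ...

        glfwSwapBuffers(win);
        // ^ can block instead, if a swap queued by the previous frame is
        //   still pending when this one is submitted.

        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}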

Lilith

On 2010-08-06 11:15:14 AM, Buckley, Bob CTR MDA/DES wrote:
 The sync blocks on a swap buffers call, not on the first OpenGL call 
 (unless
 by accident it's an actual OpenGL blocking call).
 Non-blocking OpenGL calls run asynchronous on the back buffer (traditional
 double buffering).
 A glGet call is an actual OpenGL blocking call as it forces a pipe flush.
 Making blocking calls when trying to run real time thoroughly defeats the
 purpose.
 
 Giovanni, what you're seeing is typical behavior when syncing with the
 vertical retrace.
 To maintain real-time at 50Hz each frame must be rendered in less than 20ms
 (1/50).
 If a frame just happens to take 21ms, then the buffer swap will block for
 19ms before actually swapping buffers.
 Hence, your frame rate is cut completely in half (21ms + 19ms = 40ms =
 25Hz).
 And it also introduces nasty temporal aliasing.
 
 I'm not aware of another way to synchronize with such regularity as the
 monitor retrace.
 I'm guessing that deterministic hardware is required given the nature of
 something like OpenSceneGraph on a PC.
 Although, you can achieve near real-time by things like database tuning and
 pre-emptive rendering.
 But, nothing to guarantee actual real time.
 
 Extra hardware is not needed to run at multiples of the frame rate.
 Just set the retrace rate to the desired frame rate and run sync'd.
 Boom - multiples of the frame rate.
 
 
 BTW, there's something about this alleged '2 frame latency' charge that 
 just
 doesn't pass the smell test.
 Mostly - ATI sure the hell wouldn't let 'em!
 
 Bob
 
 -Original Message-
 From: osg-users-boun...@lists.openscenegraph.org
 [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Bunfield,
 Dennis AMRDEC/AEgis
 Sent: Wednesday, June 23, 2010 4:35 PM
 To: osg-users@lists.openscenegraph.org
 Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)
 
 Classification: UNCLASSIFIED
 Caveats: FOUO
 
 The sync actually blocks on the first OpenGL call.  So your osg update
 and cull stages will run then you will block on the draw stage until the
 vsync.  Your problem is actually worse than 20ms w/o you knowing it.
 For Nvidia cards there is a built in 2 frames of latency.  So even after
 your sync you won't see the image you updated come out the DVI until 2
 render frames later.
 
 In order for you to do what you want you will need some expensive frame
 lock hardware with external syncs to run at a multiple of the frame
 rate.
 
 -Original Message-
 From: osg-users-boun...@lists.openscenegraph.org
 [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of
 Giovanni Ferrari
 Sent: Wednesday, June 23, 2010 3:28 AM
 To: osg-users@lists.openscenegraph.org
 Subject: [osg-users] OSG and vsync
 
 Hi,
 
 i'm developing an application that need vsync to avoid tearing but i'm
 having some problems.
 Here is the (really simplified) code structure of my application: 
 
 
 Code:
 
 ...
 while ( !mViewer->done() ) {
   mGraph->Update(...);
   mViewer->frame();
 }
 
 
 
 
 
 I've noticed frame() function is blocking when vsync is enabled.
 This means that i call the frame function, i stay blocked for something
 like 20ms (50hz PAL), and then i must terminate the Update call in the
 vsync period otherwise i won't be able to draw a new frame ( the graphic
 card draw the old content of frame buffer without changes performed in
 the update function. Changes will be performed in the next frame ).
 
 As you can immagine this is a big problem for real-time applications
 cause i'm introducing 20ms of retard.
 
 Is there a way to syncronize frame call without using vsync ? or can i
 scompose the frame function to be able to separate functions that
 operate on the graph from those that perform the rendering ? 
 The second solution could help me cause i would be able to operate on
 the graph with mGraph->Update(...) even if the frame part that writes
 the frameBuffer is blocked.
 
 I hope i've explained my problem clearly.
 
 Thank you!
 
 Cheers,
 Giovanni
 
 --
 Read this topic online here:
 http://forum.openscenegraph.org/viewtopic.php?p=29295#29295
 
 
 
 
 
 ___
 osg-users mailing list
 osg-users@lists.openscenegraph.org
 http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
 Classification: UNCLASSIFIED
 Caveats: FOUO
 
 
 ___
 osg-users mailing list
 osg-users@lists.openscenegraph.org
 http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
 ___
 osg-users mailing list
 osg-users@lists.openscenegraph.org
 http://lists.openscenegraph.org

Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-08-05 Thread Buckley, Bob CTR MDA/DES
I'm talking traditional/typical double buffering, not special cases.


-Original Message-
From: osg-users-boun...@lists.openscenegraph.org 
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Lilith Bryant
Sent: Thursday, August 05, 2010 5:27 PM
To: OpenSceneGraph Users
Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)

My own research indicates that the timing rules are:

The first opengl call will block if the back buffer is not available at that 
point.

The swapbuffers will block if there is already a pending swap queued (by the 
last call to swapbuffers)

So, depending on the circumstances, both can block.

Lilith

On 2010-08-06 11:15:14 AM, Buckley, Bob CTR MDA/DES wrote:
 The sync blocks on a swap buffers call, not on the first OpenGL call 
 (unless
 by accident it's an actual OpenGL blocking call).
 Non-blocking OpenGL calls run asynchronous on the back buffer (traditional
 double buffering).
 A glGet call is an actual OpenGL blocking call as it forces a pipe flush.
 Making blocking calls when trying to run real time thoroughly defeats the
 purpose.
 
 Giovanni, what you're seeing is typical behavior when syncing with the
 vertical retrace.
 To maintain real-time at 50Hz each frame must be rendered in less than 20ms
 (1/50).
 If a frame just happens to take 21ms, then the buffer swap will block for
 19ms before actually swapping buffers.
 Hence, your frame rate is cut completely in half (21ms + 19ms = 40ms =
 25Hz).
 And it also introduces nasty temporal aliasing.
 
 I'm not aware of another way to synchronize with such regularity as the
 monitor retrace.
 I'm guessing that deterministic hardware is required given the nature of
 something like OpenSceneGraph on a PC.
 Although, you can achieve near real-time by things like database tuning and
 pre-emptive rendering.
 But, nothing to guarantee actual real time.
 
 Extra hardware is not needed to run at multiples of the frame rate.
 Just set the retrace rate to the desired frame rate and run sync'd.
 Boom - multiples of the frame rate.
 
 
 BTW, there's something about this alleged '2 frame latency' charge that 
 just
 doesn't pass the smell test.
 Mostly - ATI sure the hell wouldn't let 'em!
 
 Bob
 
 -Original Message-
 From: osg-users-boun...@lists.openscenegraph.org
 [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Bunfield,
 Dennis AMRDEC/AEgis
 Sent: Wednesday, June 23, 2010 4:35 PM
 To: osg-users@lists.openscenegraph.org
 Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)
 
 Classification: UNCLASSIFIED
 Caveats: FOUO
 
 The sync actually blocks on the first OpenGL call.  So your osg update
 and cull stages will run then you will block on the draw stage until the
 vsync.  Your problem is actually worse than 20ms w/o you knowing it.
 For Nvidia cards there is a built in 2 frames of latency.  So even after
 your sync you won't see the image you updated come out the DVI until 2
 render frames later.
 
 In order for you to do what you want you will need some expensive frame
 lock hardware with external syncs to run at a multiple of the frame
 rate.
 
 -Original Message-
 From: osg-users-boun...@lists.openscenegraph.org
 [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of
 Giovanni Ferrari
 Sent: Wednesday, June 23, 2010 3:28 AM
 To: osg-users@lists.openscenegraph.org
 Subject: [osg-users] OSG and vsync
 
 Hi,
 
 i'm developing an application that need vsync to avoid tearing but i'm
 having some problems.
 Here is the (really simplified) code structure of my application: 
 
 
 Code:
 
 ...
 while ( !mViewer->done() ) {
   mGraph->Update(...);
   mViewer->frame();
 }
 
 
 
 
 
 I've noticed frame() function is blocking when vsync is enabled.
 This means that i call the frame function, i stay blocked for something
 like 20ms (50hz PAL), and then i must terminate the Update call in the
 vsync period otherwise i won't be able to draw a new frame ( the graphic
 card draw the old content of frame buffer without changes performed in
 the update function. Changes will be performed in the next frame ).
 
 As you can immagine this is a big problem for real-time applications
 cause i'm introducing 20ms of retard.
 
 Is there a way to syncronize frame call without using vsync ? or can i
 scompose the frame function to be able to separate functions that
 operate on the graph from those that perform the rendering ? 
 The second solution could help me cause i would be able to operate on
 the graph with mGraph->Update(...) even if the frame part that writes
 the frameBuffer is blocked.
 
 I hope i've explained my problem clearly.
 
 Thank you!
 
 Cheers,
 Giovanni
 
 --
 Read this topic online here:
 http://forum.openscenegraph.org/viewtopic.php?p=29295#29295
 
 
 
 
 
 ___
 osg-users mailing list
 osg-users@lists.openscenegraph.org
 http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
 Classification

Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-06-28 Thread Bunfield, Dennis AMRDEC/AEgis
Classification: UNCLASSIFIED
Caveats: FOUO

Nvidia chips use a very long pipeline, and the GPUs like processing
data.  When they are not processing data they have a hard time
performing.  In order to keep the performance up, NVIDIA introduces the
2 frames of latency to keep the pipeline full.  Most likely the 40ms of
delay you are noticing is your 2 frames of latency.  You will probably
only notice the 2 frame latency when you are vsync'ed.  When you are not
vsync'ed you will get the tearing, but you are also running faster than
the sync rate and probably won't notice that built-in latency.
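
If bounding that queued-up latency matters more than throughput, one commonly
suggested workaround -- my addition here, not something this thread verifies
against these Quadro drivers -- is to make the driver drain its queue once per
frame, e.g. with a sync object (ARB_sync) right after the swap, or simply
glFinish():

// Sketch: after the buffer swap, wait until the GPU has actually finished
// this frame before starting the next one, so the driver cannot queue
// extra frames.  Trades some throughput for bounded latency.
// (Assumes a context exposing ARB_sync; glFinish() is the blunt fallback.)
GLsync fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
glClientWaitSync(fence, GL_SYNC_FLUSH_COMMANDS_BIT, 100000000);  // 100 ms, in ns
glDeleteSync(fence);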

Hope that helps...

-Original Message-
From: osg-users-boun...@lists.openscenegraph.org
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of
Giovanni Ferrari
Sent: Thursday, June 24, 2010 1:04 AM
To: osg-users@lists.openscenegraph.org
Subject: Re: [osg-users] OSG and vsync

Thank you very much for your reply ! 
I understand your suggestion about using external syncs to run at a
multiple of the frame rate, but I don't understand the built-in 2
frames of latency in Nvidia cards. 

I mention this because I used my system both with and without vsync, and I
noticed that running with vsync I don't have tearing, but I measure an extra
40ms of delay in the total closed loop with respect to the free-run case (no
vsync). (By total closed loop I mean image rendering time +
transmission time + frame processing time + results time.)

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=29347#29347





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
Classification: UNCLASSIFIED
Caveats: FOUO


___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-06-28 Thread Bruce Wheaton
On Jun 23, 2010, at 3:35 PM, Bunfield, Dennis AMRDEC/AEgis wrote:

 For Nvidia cards there is a built in 2 frames of latency.  So even after
 your sync you won't see the image you updated come out the DVI until 2
 render frames later.

Where does this information come from Dennis? Where is this delay happening? I 
doubt it's triple buffering, since the extra memory would have to be accounted 
for, and it makes tearing mostly impossible (as on the Mac).

So you believe the GL command queue is buffered/delayed somewhere?
have huge implications for things like glGet, making them impossible to use 
without at least halving the frame rate?

Bruce




___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-06-28 Thread Bunfield, Dennis AMRDEC/AEgis
Classification: UNCLASSIFIED
Caveats: FOUO

Yes, we found this through internal testing.  Nvidia later confirmed it.
This isn't related to double or triple buffering either.  The pipeline,
as explained to me, is similar to a brake line in a car.  Everything
works well unless you inject an air bubble into the brake line.  That
would be similar to issuing certain glGet commands. The driver will tell
the GPU to stop its processing so that it can handle your glGet
request. So for real-time programming you really need to be aware of
this -- and not do it.  Depending upon the type of readback you are
performing, you will introduce a momentary lag in the system because the
GPU has to stop everything it is doing to respond back to you.
glReadPixels behaves a little differently: as long as the pixel format
aligns with the framebuffer format, the driver can just DMA that
framebuffer back to the CPU. If your pixel formats are not aligned you
will get bad performance again, because the GPU will have to stop what
it is doing and reformat the data to send back.


-Original Message-
From: osg-users-boun...@lists.openscenegraph.org
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Bruce
Wheaton
Sent: Monday, June 28, 2010 12:46 PM
To: OpenSceneGraph Users
Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)

On Jun 23, 2010, at 3:35 PM, Bunfield, Dennis AMRDEC/AEgis wrote:

 For Nvidia cards there is a built in 2 frames of latency.  So even
after
 your sync you won't see the image you updated come out the DVI until 2
 render frames later.

Where does this information come from Dennis? Where is this delay
happening? I doubt it's triple buffering, since the extra memory would
have to be accounted for, and it makes tearing mostly impossible (as on
the Mac).

So you believe the Gl command queue is buffered/delayed somewhere?
Doesn't that have huge implications for things like glGet, making them
impossible to use without at least halving the frame rate?

Bruce




___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
Classification: UNCLASSIFIED
Caveats: FOUO


___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-06-28 Thread Bunfield, Dennis AMRDEC/AEgis
Classification: UNCLASSIFIED
Caveats: FOUO

If you are interested in seeing what we did WRT latency testing go to
this link

http://www.ccur.com/Libraries/docs_pdf/AMRDEC_PC_Scenegeneration_whitepaper.pdf


-Original Message-
From: osg-users-boun...@lists.openscenegraph.org
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of
Bunfield, Dennis AMRDEC/AEgis
Sent: Monday, June 28, 2010 1:39 PM
To: OpenSceneGraph Users
Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)

Classification: UNCLASSIFIED
Caveats: FOUO

Yes, we found this through internal testing.  Nvidia later confirmed it.
This isn't related to double or triple buffering either.  The pipeline,
as explained to me, is similar to a brake line in a car.  Everything
works well unless you inject an air bubble into the brake line.  That
would be similar to issuing certain glGet commands. The driver will tell
the GPU to stop its processing so that it can handle your glGet
request. So for real-time programming you really need to be aware of
this -- and not do it.  Depending upon the type of readback you are
performing, you will introduce a momentary lag in the system because the
GPU has to stop everything it is doing to respond back to you.
glReadPixels behaves a little differently: as long as the pixel format
aligns with the framebuffer format, the driver can just DMA that
framebuffer back to the CPU. If your pixel formats are not aligned you
will get bad performance again, because the GPU will have to stop what
it is doing and reformat the data to send back.


-Original Message-
From: osg-users-boun...@lists.openscenegraph.org
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Bruce
Wheaton
Sent: Monday, June 28, 2010 12:46 PM
To: OpenSceneGraph Users
Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)

On Jun 23, 2010, at 3:35 PM, Bunfield, Dennis AMRDEC/AEgis wrote:

 For Nvidia cards there is a built in 2 frames of latency.  So even
after
 your sync you won't see the image you updated come out the DVI until 2
 render frames later.

Where does this information come from Dennis? Where is this delay
happening? I doubt it's triple buffering, since the extra memory would
have to be accounted for, and it makes tearing mostly impossible (as on
the Mac).

So you believe the Gl command queue is buffered/delayed somewhere?
Doesn't that have huge implications for things like glGet, making them
impossible to use without at least halving the frame rate?

Bruce




___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
Classification: UNCLASSIFIED
Caveats: FOUO


___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
Classification: UNCLASSIFIED
Caveats: FOUO


___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-06-28 Thread Bruce Wheaton
Thanks for the info Dennis, I'll take a look at your paper too. Just surprised 
the latency can be that much, but it really explains why glGet's are quite so 
awful!

Bruce
 
On Jun 28, 2010, at 11:38 AM, Bunfield, Dennis AMRDEC/AEgis wrote:

 Classification: UNCLASSIFIED
 Caveats: FOUO
 
 Yes, we found this through internal testing.  Nvidia later confirmed it.
 This isn't related to double or triple buffering either.  The pipeline,
 as explained to me, is similar to a brake line in a car.  Everything
 works well unless you inject an air bubble into the brake line.  That
 would be similar to issuing certain glGet commands. The driver will tell
 the GPU to stop its processing so that it can handle your glGet
 request. So for real-time programming you really need to be aware of
 this -- and not do it.  Depending upon the type of readback you are
 performing, you will introduce a momentary lag in the system because the
 GPU has to stop everything it is doing to respond back to you.
 glReadPixels behaves a little differently: as long as the pixel format
 aligns with the framebuffer format, the driver can just DMA that
 framebuffer back to the CPU. If your pixel formats are not aligned you
 will get bad performance again, because the GPU will have to stop what
 it is doing and reformat the data to send back.
 
 
 -Original Message-
 From: osg-users-boun...@lists.openscenegraph.org
 [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Bruce
 Wheaton
 Sent: Monday, June 28, 2010 12:46 PM
 To: OpenSceneGraph Users
 Subject: Re: [osg-users] OSG and vsync (UNCLASSIFIED)
 
 On Jun 23, 2010, at 3:35 PM, Bunfield, Dennis AMRDEC/AEgis wrote:
 
 For Nvidia cards there is a built in 2 frames of latency.  So even
 after
 your sync you won't see the image you updated come out the DVI until 2
 render frames later.
 
 Where does this information come from Dennis? Where is this delay
 happening? I doubt it's triple buffering, since the extra memory would
 have to be accounted for, and it makes tearing mostly impossible (as on
 the Mac).
 
 So you believe the Gl command queue is buffered/delayed somewhere?
 Doesn't that have huge implications for things like glGet, making them
 impossible to use without at least halving the frame rate?
 
 Bruce
 
 
 
 
 ___
 osg-users mailing list
 osg-users@lists.openscenegraph.org
 http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
 Classification: UNCLASSIFIED
 Caveats: FOUO
 
 
 ___
 osg-users mailing list
 osg-users@lists.openscenegraph.org
 http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org

___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] OSG and vsync (UNCLASSIFIED)

2010-06-23 Thread Bunfield, Dennis AMRDEC/AEgis
Classification: UNCLASSIFIED
Caveats: FOUO

The sync actually blocks on the first OpenGL call.  So your OSG update
and cull stages will run, then you will block on the draw stage until the
vsync.  Your problem is actually worse than 20ms without you knowing it.
For Nvidia cards there is a built in 2 frames of latency.  So even after
your sync you won't see the image you updated come out the DVI until 2
render frames later.

In order for you to do what you want you will need some expensive frame
lock hardware with external syncs to run at a multiple of the frame
rate.

-Original Message-
From: osg-users-boun...@lists.openscenegraph.org
[mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of
Giovanni Ferrari
Sent: Wednesday, June 23, 2010 3:28 AM
To: osg-users@lists.openscenegraph.org
Subject: [osg-users] OSG and vsync

Hi,

I'm developing an application that needs vsync to avoid tearing, but I'm
having some problems.
Here is the (really simplified) code structure of my application: 


Code:

...
while ( !mViewer->done() ) {
  mGraph->Update(...);
  mViewer->frame();
}





I've noticed the frame() function is blocking when vsync is enabled.
This means that I call the frame function, I stay blocked for something
like 20ms (50Hz PAL), and then I must finish the Update call within the
vsync period, otherwise I won't be able to draw a new frame (the graphics
card draws the old content of the frame buffer without the changes made
in the update function; the changes will appear in the next frame).

As you can imagine this is a big problem for real-time applications,
because I'm introducing 20ms of delay.

Is there a way to synchronize the frame call without using vsync? Or can I
decompose the frame function so as to separate the functions that
operate on the graph from those that perform the rendering?
The second solution could help me because I would be able to operate on
the graph with mGraph->Update(...) even while the part of frame() that
writes the framebuffer is blocked.

I hope I've explained my problem clearly.

Thank you!

Cheers,
Giovanni

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=29295#29295





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
Classification: UNCLASSIFIED
Caveats: FOUO


___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org