Hello. I am a bit new to OSG and had a question.

What I want to do is offset a camera's display by a variable time delay. I
currently use a camera to render to an image and then a separate camera to
render that image to the screen. In the first camera's post draw callback
I swap images into a ring buffer so that the image displayed by the second
camera is a previously rendered frame.
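The ring-buffer bookkeeping behind this can be sketched without any OSG types (a standalone model; the struct and member names are illustrative, not from my actual code): each frame, the scene camera renders into the "write" slot while the display camera shows the slot written `delay` frames earlier.

```cpp
#include <cstddef>

// Standalone sketch of an N-frame delay ring. With a delay of d frames,
// d + 1 slots are enough: the writer and reader never collide, and the
// reader always trails the writer by exactly d frames.
struct FrameRing {
    explicit FrameRing(std::size_t delay)
        : slots(delay + 1), frame(0) {}

    // Slot the scene camera should render into this frame.
    std::size_t writeSlot() const { return frame % slots; }

    // Slot the display camera should show this frame. Trailing the
    // writer by `delay` is the same as leading it by 1 modulo slots,
    // which keeps the arithmetic non-negative.
    std::size_t readSlot() const { return (frame + 1) % slots; }

    void advance() { ++frame; }  // call once per rendered frame

    std::size_t slots;  // ring size = delay + 1
    std::size_t frame;  // frames rendered so far
};
```

(During the first `delay` frames the read slot has not been written yet, so in practice those slots should be pre-initialized to something sensible.)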

This worked fine as long as I copied the image data; however, for
efficiency reasons I wanted to switch to an implementation that simply
rebinds the images to the appropriate cameras during the callback. After
some testing I found that, while the second camera was rendering the
correct frame from the ring buffer to the screen, the first camera was not
rendering to the correct image. Specifically, I could not get it to render
to any image other than the first one I gave it.

This is what I am using to change the render target:

// Take the next image from the ring buffer as the new render target.
osgImage = newImage.get();

// Detach the old color attachment and attach the new target.
sceneCameraNode->detach(osg::CameraNode::COLOR_BUFFER);
sceneCameraNode->attach(osg::CameraNode::COLOR_BUFFER, osgImage.get());

// Point the display texture at the image and force a texture rebuild.
osgTexture->setImage(osgImage.get());
osgTexture->dirtyTextureObject();

newImage is the (initialized) desired render target and osgImage is the
prior render target; the two are never the same object (newImage ==
osgImage always evaluates false). sceneCameraNode is the CameraNode whose
output I want rendered into osgImage, and the image is bound to osgTexture
for display.

Currently the system continues to render to the first image object I give
it. Because I use a ring buffer, this means I only see one out of every N
frames, where N is the size of the ring buffer.
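The symptom is consistent with the writer being stuck on the first image while the reader advances through the ring. A standalone model of that failure mode (names are illustrative, not from my code):

```cpp
#include <cstddef>

// Model of the observed bug: the scene camera keeps writing slot 0
// (the rebinding has no effect), while the display camera steps through
// the ring each frame. The displayed image is therefore fresh only on
// the one frame in N where the read slot happens to be slot 0.
bool frameIsFresh(std::size_t frame, std::size_t ringSize) {
    std::size_t writeSlot = 0;                 // stuck on the first image
    std::size_t readSlot  = frame % ringSize;  // display camera advances
    return readSlot == writeSlot;
}
```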

I am unsure whether the above code is correct, or even whether this is the
best way to go about it. Any help would be appreciated.

-Nicholas


_______________________________________________
osg-users mailing list
[email protected]
http://openscenegraph.net/mailman/listinfo/osg-users
http://www.openscenegraph.org/