Hi again,

Thanks for following up. I've tried it before with an attached pbuffer, but I
was - and still am - getting unpredictable results. As before, if I simply
place a camera as a node in the scene and attach a texture using the FBO
render target implementation, it renders the subgraph to the texture with no
problem. But if I attach a texture to the main or slave camera of a view,
whether I use a pbuffer for offscreen rendering or a separate window, the
texture just ends up filled with random data in video memory. The camera
otherwise works properly: if I comment out the RTT-related lines and switch
the graphics context to the window's, it renders to a viewport without issues.
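
To be concrete, the failing case looks roughly like this (a simplified sketch
from memory; the texture size, the cow.osg model and the hard-coded traits are
just placeholders):

#include <osg/Texture2D>
#include <osgDB/ReadFile>
#include <osgViewer/Viewer>

int main()
{
    // Subgraph to render to texture; cow.osg is only a stand-in model.
    osg::ref_ptr<osg::Node> scene = osgDB::readNodeFile("cow.osg");

    // Texture that should receive the rendered image.
    osg::ref_ptr<osg::Texture2D> texture = new osg::Texture2D;
    texture->setTextureSize(1024, 1024);
    texture->setInternalFormat(GL_RGBA);

    // Offscreen pbuffer context for the view.
    osg::ref_ptr<osg::GraphicsContext::Traits> traits = new osg::GraphicsContext::Traits;
    traits->x = 0;
    traits->y = 0;
    traits->width = 1024;
    traits->height = 1024;
    traits->pbuffer = true;
    osg::ref_ptr<osg::GraphicsContext> gc =
        osg::GraphicsContext::createGraphicsContext(traits.get());

    osgViewer::Viewer viewer;
    osg::Camera* camera = viewer.getCamera();  // the view's main camera
    camera->setGraphicsContext(gc.get());
    camera->setViewport(new osg::Viewport(0, 0, 1024, 1024));
    camera->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
    // This is the attach that ends up filled with random data for me.
    camera->attach(osg::Camera::COLOR_BUFFER, texture.get());

    viewer.setSceneData(scene.get());
    return viewer.run();
}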

I would have simply used the currently working method and updated the camera's
matrices directly, but we're using more complex manipulators, and making their
functionality redundant would be an ugly solution. I took a look at examples
like osgdistortion and, implementation details aside, I can't see any
differences in camera or scene setup that might cause this. It almost has to
be something on my end, given that this exact functionality is shown in the
examples and I've gotten the same result on two different computers (both
running Windows 7 with an Nvidia GTX 460/560).
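
For completeness, the workaround I'd rather avoid would be something along
these lines (only a sketch; rttCamera and viewCamera are placeholders for the
in-scene RTT camera, set to an absolute reference frame, and the view's
manipulator-driven camera):

#include <osg/Camera>
#include <osg/NodeCallback>

// Update callback that copies the view camera's matrices into the in-scene
// RTT camera every frame, duplicating what the manipulator already does.
struct SyncMatricesCallback : public osg::NodeCallback
{
    SyncMatricesCallback(osg::Camera* source) : _source(source) {}

    virtual void operator()(osg::Node* node, osg::NodeVisitor* nv)
    {
        osg::Camera* rtt = static_cast<osg::Camera*>(node);
        rtt->setViewMatrix(_source->getViewMatrix());
        rtt->setProjectionMatrix(_source->getProjectionMatrix());
        traverse(node, nv);
    }

    osg::ref_ptr<osg::Camera> _source;
};

// usage: rttCamera->setUpdateCallback(new SyncMatricesCallback(viewCamera));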
