Hello,

I have been using OSG in a robotics simulator [1] for a while now. I would 
now like to implement a "camera sensor", i.e. a simulation of a physical 
camera mounted on a robot. 
I got it working with Render to Texture (RTT). However, I have the 
following problem: the rendering of the normal graphics is decoupled from 
the speed of the physical simulation, e.g. I can run the physics as fast 
as possible while the scene is still rendered at only 25 fps. The camera 
sensor, however, needs to be synchronized with the physical simulation. 
How can I render the scene into the texture explicitly, independently of 
the viewer? At the moment the scene graph looks as follows (a rough code 
sketch of the sensor camera follows below the diagram):
Viewer with main cam 
  - Root  
    - HUD display (with another camera overlay to see the texture)
    - Camera sensor (RTT) 
         - World ....
    - World 
        - Shadow
           - Scene
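
For reference, the sensor camera is set up roughly along these lines 
(simplified sketch, FBO-based RTT similar to the osgprerender example; 
the function and variable names are just placeholders):

#include <osg/Camera>
#include <osg/Texture2D>

// Simplified sketch of the RTT sensor camera: FBO render target,
// pre-render order, pose driven from the simulated robot.
osg::ref_ptr<osg::Camera> createSensorCamera(osg::Node* world,
                                             osg::Texture2D* target,
                                             int width, int height)
{
    osg::ref_ptr<osg::Camera> cam = new osg::Camera;
    cam->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
    cam->setRenderOrder(osg::Camera::PRE_RENDER);        // drawn before the main camera
    cam->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    cam->setReferenceFrame(osg::Transform::ABSOLUTE_RF); // view matrix set from the robot pose
    cam->setViewport(0, 0, width, height);
    cam->attach(osg::Camera::COLOR_BUFFER, target);      // result ends up in the sensor texture
    cam->addChild(world);                                // sees the same world as the main view
    return cam;
}

The target is an osg::Texture2D (setTextureSize(width, height)) that the 
HUD overlay references to display the sensor image.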

I also have problems with the shadows when the RTT camera is used. I read 
in [2-3] that slave views should be used for this, but did not succeed 
with that approach. 
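
What I tried for the slave-view approach was roughly the following 
(simplified, and quite possibly the part I got wrong):

// Add the sensor camera as a slave of the viewer; 'false' so that it keeps
// its own subgraph (the RTT world) instead of the master's scene data.
viewer.addSlave(sensorCam.get(), osg::Matrixd(), osg::Matrixd(), false);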

Any suggestions would be highly appreciated. 
 
Cheers,
        Georg

[1] http://robot.informatik.uni-leipzig.de/videos
[2] http://www.mail-archive.com/osg-us...@lists.openscenegraph.org/msg33512.html
[3] http://www.mail-archive.com/osg-us...@lists.openscenegraph.org/msg36229.html
-- 
---- Georg Martius,  Tel: +49 177 6413311  -----
------- http://www.flexman.homeip.net ----------
