Re: [osg-users] [osgOcean] Cylinder/Skydome flicker when using CompositeView
Hi,

I've done some more poking around and found out a few things. If I disable the cylinder completely by simply not adding the _oceanCylinderMT child to the _oceanTransform group in OceanScene.cpp, the flashing stops.

The flashing occurs over the water. The effect is seemingly random, but the color that is shown seems to be the _aboveWaterFogColor, which is odd because the cylinder uses the _underWaterFogColor. I set the above-water color to pure green and the underwater color to pure red, and the color that flashes is green.

The problem goes away if I reduce my cylinder/skydome size. My application requires a far viewing distance, so I've set the respective sizes to 30,000. When I bump the size back down to 1,900 (as set in the ocean example), the flashing doesn't occur. However, I'd like the flexibility to set my viewing distance much farther than 1,900.

Thank you!

Cheers,
Ben

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=63162#63162

_______________________________________________
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
[osg-users] [osgOcean] Cylinder/Skydome flicker when using CompositeView
Hi,

I'm using osgOcean in a multi-view, multi-graphics-card configuration and I'm encountering a problem with the ocean's cylinder and skydome causing flickering effects.

Setup

I have 6 cameras all looking into the same scene. They all have the same position but different orientations and frustums. They all render to different screens, each on its own graphics card. I've disabled most advanced rendering effects, such as:

- Refractions
- God rays
- Silt
- Underwater DOF
- Glare
- Distortion

I have the latest bits from the osgOcean repository and have no problem compiling them. I'm using FFTSS 3.0.

Main Problem

My problem is that with multiple views and one ocean, the skydome and the cylinder tend to move around as if they're on a different screen. For example, when I'm viewing the ocean at an angle (some degree of roll), I tend to see one view's cylinder geometry appear in another view's scene. This produces a brief one-frame flicker effect that is very distracting. With the skydome, I imagine a similar problem is happening (due to how they're both updated per view), but it just seems to flicker in and out of visibility.

Attempted Solution

I figured this was a threading issue where one thread would use the values of another thread's operations. With that in mind, I added multiple CameraTrackCallbacks to the ocean scene, each referencing the main camera of one of the views. In operator(), I check the stored Camera pointer against the camera obtained through cv->getRenderStage()->getCamera() to see whether the pointers are equal. If so, I continue to update the cylinder position; if not, I just continue to traverse. This worked great for the skydome (no more flickering!) but not as well for the cylinder.
Example code for adding a new CameraTrackCallback for the skydome:

Code:

void attachCamera(osg::Camera* camera)
{
    osg::MatrixTransform* transform = new osg::MatrixTransform;
    transform->setDataVariance(osg::Object::DYNAMIC);
    transform->setMatrix(osg::Matrixf::translate(osg::Vec3f(0.f, 0.f, 0.f)));
    transform->setCullCallback(new CameraTrackCallback(camera));
    transform->addChild(m_skyDome.get());
    m_oceanScene->addChild(transform);
    m_oceanScene->addCamera(camera);
}

Side Problem (with solution!)

A problem I did find a solution to was using osgOcean with a CompositeViewer set to ThreadPerContext or ThreadPerCamera. In OceanScene.cpp's CameraTrackCallback::createOrReuseMatrix, I tended to get deletes on allocated matrices that had already been deleted (a double delete) within moments of running my app. I threw in a mutex that scope-locks within the function, which fixed the crash but not the flicker.

Question

Has anyone else encountered this issue? Maybe I'm setting up my ocean wrong; I'm using a modified version of the Scene class in the oceanExample project to set up the OceanScene.

Thank you!

Cheers,
Ben

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=63144#63144
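The scope-lock fix described in the side problem above can be sketched in isolation. This is a hypothetical stand-alone illustration, not the osgOcean source: the real code recycles osg::RefMatrix objects inside CameraTrackCallback::createOrReuseMatrix, and `Matrix`/`MatrixPool` here are invented stand-ins, but the locking pattern is the one described.

```cpp
#include <cstddef>
#include <memory>
#include <mutex>
#include <vector>

// Hypothetical stand-in for a createOrReuseMatrix-style reuse pool.
struct Matrix { double m[16]; };

class MatrixPool
{
public:
    // Hand out the next pooled matrix, growing the pool on demand.
    // The std::lock_guard scope-locks the whole function, so two cull
    // threads can no longer race on _next/_pool and double-delete entries.
    Matrix* createOrReuseMatrix()
    {
        std::lock_guard<std::mutex> lock(_mutex);
        if (_next == _pool.size())
            _pool.push_back(std::unique_ptr<Matrix>(new Matrix()));
        return _pool[_next++].get();
    }

    // Called once per frame to make the whole pool reusable again.
    void reset()
    {
        std::lock_guard<std::mutex> lock(_mutex);
        _next = 0;
    }

private:
    std::mutex _mutex;
    std::vector<std::unique_ptr<Matrix>> _pool;
    std::size_t _next = 0;
};
```

Note that the lock serializes allocation across cull threads but does nothing for the per-view flicker, which matches the behaviour reported above: the crash is a data race on the pool, while the flicker is one view consuming another view's matrix values.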
Re: [osg-users] Slaves, RTT, and HUD
Hey Robert,

Thanks for the reply. I've finally gotten around to implementing a solution to this problem. Unfortunately, when I added the text HUD camera as a child of the RTT main camera, I still wasn't seeing my HUD. However, if I turn the HUD camera into another RTT camera and attach it to the texture of the main RTT camera (with only GL_DEPTH_BUFFER_BIT set as the clear mask), then it works great!

From: osg-users [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Robert Osfield
Sent: Wednesday, November 05, 2014 12:42 AM
To: OpenSceneGraph Users
Subject: Re: [osg-users] Slaves, RTT, and HUD

Hi Ben,

Try placing the Text HUD Camera in the scene graph underneath the main scene Camera and leave all the frame buffer/texture attachment settings blank; this way it'll inherit the current frame buffer configuration from its parent.

Robert.

On 4 November 2014 20:58, Ben Strukus <ben.stru...@flyte.com> wrote:

Hi Robert,

The osgdistortion example is a great resource and I've used it quite a bit to set up the RTT and the distortion mesh. However, when it comes to rendering a HUD (text, in my case) to that texture using slaves, I'm not seeing that being done in the example. Here's what I have:

- A slave camera that renders the main scene to a texture.
- A slave camera that renders a screen-space mesh with the above texture applied to it.
- A lone camera acting as a HUD that I would like to render to the texture, but can't seem to get working.

I've tried attaching the lone camera as a child of the RTT camera, and that doesn't seem to work. I can get it to work when the RTT camera isn't a slave, but as soon as I make it a slave it doesn't work. I can get the HUD to render directly to the screen when I attach it as a child of the screen-space mesh camera, but then it doesn't appear distorted.
From: osg-users [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Robert Osfield
Sent: Tuesday, November 04, 2014 12:14 PM
To: OpenSceneGraph Users
Subject: Re: [osg-users] Slaves, RTT, and HUD

Hi Ben,

Have a look at the osgdistortion example as it does exactly what you describe you want to do.

Robert.

On 4 November 2014 19:53, Ben Strukus <ben.stru...@flyte.com> wrote:

Hello,

I'm looking to get the following scenario working: a scene with a HUD rendering on a distortion mesh using slave cameras. What I've tried so far is...

1. A normal Viewer class with two Cameras in the scene graph: one on a pre-render pass that renders the main scene to a texture, and one on a post-render pass that renders a screen-space mesh with the other Camera's texture applied to it.
2. A Viewer class with two Cameras attached as slaves, with the same pre- and post-render setup as above. The RTT (pre-render) Camera has the main scene as a child, and the mesh (post-render) Camera has just the screen-space mesh as a child.

For these, I can get a HUD displaying on the following Cameras when I attach it as a child of the respective Camera:

- 1's pre-render Camera, attached to the texture and set up as an FBO
- 1's post-render Camera
- 2's post-render Camera

For my HUD, I'm simply creating a Camera with an absolute reference frame and an orthographic 2D projection, on the post-render pass. See osghud.cpp; I've pretty much copied that, with the modification of an optional Texture parameter.

The part I'm having trouble with is getting that HUD rendering to a texture when the RTT Camera is a slave of a View. The thing that baffles me is that I can get it working on a non-slave RTT Camera in the same way, and I can get it working on a non-RTT slave Camera, but not the intersection of both. I've searched for a similar solution, but I haven't found anything for this specific scenario.
Any help or direction would be appreciated. :)
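For reference, the working arrangement described at the top of this thread (the HUD turned into a second RTT camera sharing the main RTT camera's texture, clearing only depth) might be wired up roughly like this. This is an untested sketch, not code from the thread; `texture`, `hudGeode`, `rttCamera`, and the ortho dimensions are placeholder names and values:

```cpp
// Untested sketch: render the HUD into the SAME texture the main RTT camera
// wrote, clearing only the depth buffer so the scene colour is preserved.
osg::ref_ptr<osg::Camera> hudCamera = new osg::Camera;
hudCamera->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
hudCamera->setProjectionMatrixAsOrtho2D(0.0, 1280.0, 0.0, 1024.0); // placeholder size
hudCamera->setViewMatrix(osg::Matrix::identity());
hudCamera->setRenderOrder(osg::Camera::POST_RENDER);               // after the main scene pass
hudCamera->setClearMask(GL_DEPTH_BUFFER_BIT);                      // keep the colour buffer
hudCamera->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
hudCamera->attach(osg::Camera::COLOR_BUFFER, texture.get());       // same texture as the RTT camera
hudCamera->addChild(hudGeode.get());                               // the text/HUD geometry
rttCamera->addChild(hudCamera.get());
```

The GL_DEPTH_BUFFER_BIT-only clear mask is the key detail reported to make this work: clearing colour as well would wipe out the scene that the main RTT camera just rendered into the texture.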
Re: [osg-users] Aborting frames that take too long
I did some more research into what's going on and, on a hunch, changed my threading model from ThreadPerCamera (the default on my PC) to SingleThreaded and ThreadPerContext. I found that setAbortRenderingPtr does indeed lock up the camera thread under ThreadPerCamera, but it behaves 100% correctly (rendering nothing while enabled) under the SingleThreaded and ThreadPerContext models.

An easy way to reproduce it is to add a keyboard input event handler that calls setAbortRenderingPtr on the viewer's state. With ThreadPerCamera, the scene should lock up. With the other models, it should just stop rendering while still remaining responsive.

From: osg-users [mailto:osg-users-boun...@lists.openscenegraph.org] On Behalf Of Ben Strukus
Sent: Monday, January 19, 2015 4:08 PM
To: OpenSceneGraph Users
Subject: [osg-users] Aborting frames that take too long

Hi,

I'm having a problem trying to use OSG for a specific scenario. My simulation contains several screens all showing the same scene at different angles (think projector setup). I'd like to synchronize the frames among them so that tearing between screens is minimal. Due to the nature of my scenes, some screens will have more to render than others and, as a result, take longer to render (ground vs. sky). I'm looking for a way to interrupt a render loop and restart it if I detect it's taking too long for my liking.

I've looked at a few solutions proposed by others with a similar problem and found out about osg::State::setAbortRenderingPtr(bool*). That does indeed cancel the current frame, but it causes the viewer to lock up waiting for a mutex to become available in the Renderer::ThreadSafeQueue::takeFront() function called from Renderer::draw(). I've tried setting the abort-rendering pointer to NULL every frame (using the FRAME event on osgGA::GUIEventAdapter) and setting it to NULL immediately after it's been checked in RenderLeaf::render, though that doesn't seem to change anything.
The wait function still takes control. I've read that the setAbortRenderingPtr function is old, but is there any knowledge about how it's supposed to be used? Also, if that doesn't seem like the right solution for my scenario, does anyone have any suggestions?

Thanks!
- Ben
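The abort-flag mechanism under discussion can be illustrated without OSG. This is a hypothetical stand-in, not OSG code: `drawFrame` plays the role of the draw traversal, and its flag check mirrors the one performed during rendering (via RenderLeaf::render) once osg::State::setAbortRenderingPtr has been handed a pointer.

```cpp
#include <atomic>
#include <chrono>
#include <thread>

// Hypothetical stand-in for the abort-rendering pattern: a shared flag is
// flipped by a watchdog when the frame overruns its budget, and the draw
// loop polls it between units of work, bailing out early when it is set.
std::atomic<bool> abortRendering(false);

// Mock draw traversal: returns how many work units actually completed.
int drawFrame(int workUnits, std::chrono::milliseconds perUnit)
{
    int done = 0;
    for (int i = 0; i < workUnits; ++i)
    {
        if (abortRendering.load()) break;      // mirrors the per-leaf abort check
        std::this_thread::sleep_for(perUnit);  // stand-in for real draw work
        ++done;
    }
    return done;
}

// Watchdog: let the frame run for `budget`, then demand an abort.
void watchdog(std::chrono::milliseconds budget)
{
    std::this_thread::sleep_for(budget);
    abortRendering.store(true);
}
```

In real OSG code the flag would be handed to each graphics context's state (gc->getState()->setAbortRenderingPtr(&abortRendering)), and, as the thread notes, the flag also has to be cleared once the aborted frame has drained, or subsequent frames abort too.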