Hi Jeremy,

Is there a term for this kind of approach? Well, "impostors" comes to mind. That's where you replace a complex mesh with a quad carrying a texture that represents the complex mesh. Then, when some criterion changes significantly (orientation of the view relative to the object, light source position, etc.), you re-render the texture once to fit the new conditions. This can be useful for far-away meshes, for example trees: a full mesh is used up close, then lower and lower LODs, and eventually just a quad with a texture. Impostors are used so that the texture stays dynamic, instead of being a static billboard (which tends to look obviously like a static billboard).
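
To make that a bit more concrete, the "should I re-render?" test usually boils down to comparing the current eye direction with the one the texture was last rendered for. A minimal, untested sketch in plain OSG; the names (impostorNeedsUpdate(), rttCamera, impostorQuad, the 5-degree threshold) are mine, not from any particular library:

// Minimal, untested sketch: re-render the impostor's RTT camera only when
// the viewing direction has drifted more than a given angle away from the
// direction the texture was last rendered for. "rttCamera" is assumed to be
// an FBO camera that draws the full mesh into the impostor's texture.
#include <osg/Camera>
#include <osg/Math>
#include <osg/Vec3>
#include <osgViewer/Viewer>
#include <cmath>

bool impostorNeedsUpdate(osg::Camera* viewerCamera,
                         const osg::Vec3& impostorCenter,
                         osg::Vec3& lastEyeDir,   // cached direction, updated in place
                         float maxAngleRad)
{
    osg::Vec3 eye, center, up;
    viewerCamera->getViewMatrixAsLookAt(eye, center, up);

    osg::Vec3 eyeDir = eye - impostorCenter;
    eyeDir.normalize();

    // The dot product of two unit vectors is the cosine of the angle between them.
    if ((eyeDir * lastEyeDir) < cosf(maxAngleRad))
    {
        lastEyeDir = eyeDir;
        return true;
    }
    return false;
}

// In the frame loop, enable the RTT camera only for the frames that need it:
//   bool update = impostorNeedsUpdate(viewer.getCamera(),
//       impostorQuad->getBound().center(), lastDir, osg::DegreesToRadians(5.0f));
//   rttCamera->setNodeMask(update ? 0xffffffff : 0x0);
//   viewer.frame();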

Hope this helps,

J-S


On 11/09/2012 10:19 AM, Jeremy Moles wrote:
On Tue, 2012-09-11 at 10:54 +0800, Wang Rui wrote:
Hi Jeremy,

Thanks for the tests and feedback. I'm focusing on creating a material
system which may be a little similar to the Ogre one but will be very
easy to integrate with OSG scenes. I'd like to also have a benchmark
including a complete deferred shading pipeline in the near future to
show others how OSG produces realistic worlds. :-)

Your requirement could be easily implemented with one forward pass
rendering the scene to a texture, and two deferred passes doing the
blur work with that texture as input. A final compositing pass would
then use the outputs of the blur passes and write to a new texture.
You can then use the new texture in the scene for your own purposes
instead of displaying it directly on screen. I'd like to upload a DOF
effect file and an updated example in a few days to demonstrate how
these work.
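
In case it helps before the example is uploaded, the wiring such a chain boils down to looks roughly like this in plain OSG. This is only a sketch under my own naming: createRenderTexture(), createPass(), sceneGraph and the screen-quad/shader helpers in the comments are assumptions, not part of the effect framework.

// Rough, untested sketch of the pass layout in plain OSG: one forward pass
// into a texture, two blur passes, and a compositing pass writing into a
// final texture that the rest of the scene can sample.
#include <osg/Camera>
#include <osg/Texture2D>

osg::Texture2D* createRenderTexture(int w, int h)
{
    osg::Texture2D* tex = new osg::Texture2D;
    tex->setTextureSize(w, h);
    tex->setInternalFormat(GL_RGBA);
    tex->setFilter(osg::Texture2D::MIN_FILTER, osg::Texture2D::LINEAR);
    tex->setFilter(osg::Texture2D::MAG_FILTER, osg::Texture2D::LINEAR);
    return tex;
}

osg::Camera* createPass(osg::Texture2D* output, osg::Node* content)
{
    osg::Camera* camera = new osg::Camera;
    camera->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
    camera->setRenderOrder(osg::Camera::PRE_RENDER);
    camera->setViewport(0, 0, output->getTextureWidth(), output->getTextureHeight());
    camera->attach(osg::Camera::COLOR_BUFFER, output);
    camera->addChild(content);
    return camera;
}

// Wiring (screen-quad creation and the ortho projection for the quad passes
// are omitted here):
//   scenePass     renders sceneGraph              -> sceneTexture
//   blurXPass     renders quad(sceneTexture, X)   -> blurXTexture
//   blurYPass     renders quad(blurXTexture, Y)   -> blurYTexture
//   compositePass renders quad(blurYTexture, ...) -> finalTexture
// finalTexture can then be bound to any StateSet in the main scene.
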
Are there ever cases, when doing sophisticated layering of rendering
like this, where you'd want to manually "kick off" the EffectCompositor
for just a single frame and update the texture only once? (For example,
by setting the NodeMask to 0xF for one frame, then back to 0x0 when
you're done updating the View.)
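
Something along these lines is what I have in mind (just a sketch; "compositorNode" and renderEffectOnce() are placeholders for whatever node the compositor subgraph hangs under):

// Sketch of the "kick off once" idea: traverse the compositor subgraph for
// exactly one frame, then mask it out so the texture it produced stays frozen.
#include <osgViewer/Viewer>
#include <osg/Node>

void renderEffectOnce(osgViewer::Viewer& viewer, osg::Node* compositorNode)
{
    compositorNode->setNodeMask(0xffffffff);  // traversed (and rendered) this frame
    viewer.frame();                           // the output texture gets updated here
    compositorNode->setNodeMask(0x0);         // skipped on subsequent frames
}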

Is there a term for this kind of approach, and would it make sense to
also support this model of rendering directly or should it be left up to
the user?

Thanks,

Wang Rui

2012/9/11 Jeremy Moles <cubic...@gmail.com>:
On Mon, 2012-09-10 at 22:57 +0800, Wang Rui wrote:
This looks really cool so far. I'd be really interested to know if it
supports the following (and would be willing to create examples if
you're willing to help)...

Scenario: I want to render an entire subgraph to an FBO texture once,
then apply 2 or more completely different shaders in some order, then
put the final result into a final texture to be used somewhere in the
scene. I'm doing a Gaussian blur, which typically applies two different
shaders for x and y.
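
The two shaders are just the usual separable Gaussian pair, roughly like this (the weights, the texelWidth uniform and the reliance on gl_TexCoord[0] are placeholders on my part, not tuned or required values):

// Horizontal blur pass; the vertical pass is identical except the offset is
// vec2(0.0, float(i) * texelHeight). Assumes gl_TexCoord[0] is set by the
// vertex stage for the full-screen quad.
static const char* blurXFragSource =
    "uniform sampler2D inputTexture;\n"
    "uniform float texelWidth;\n"
    "void main()\n"
    "{\n"
    "    float w[3];\n"
    "    w[0] = 0.375; w[1] = 0.25; w[2] = 0.0625;  // 1-4-6-4-1 / 16\n"
    "    vec4 color = texture2D(inputTexture, gl_TexCoord[0].st) * w[0];\n"
    "    for (int i = 1; i < 3; ++i)\n"
    "    {\n"
    "        vec2 offset = vec2(float(i) * texelWidth, 0.0);\n"
    "        color += texture2D(inputTexture, gl_TexCoord[0].st + offset) * w[i];\n"
    "        color += texture2D(inputTexture, gl_TexCoord[0].st - offset) * w[i];\n"
    "    }\n"
    "    gl_FragColor = color;\n"
    "}\n";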

I have this working in osgPPU, but I think you already have enough here
to do it; I just couldn't put the pieces together. :)


--
______________________________________________________
Jean-Sebastien Guay              jean_...@videotron.ca
                    http://whitestar02.dyndns-web.com/

_______________________________________________
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
