Re: [osg-users] Render normal vectors to image
Hi,

Thank you, Glenn! This is what I need!

Cheers,
Han

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=76905#76905

___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
Re: [osg-users] Render normal vectors to image
You're welcome. You might be able to use the osgDB::Registry::setReadFileCallback() method to intercept read calls. At that point you can generate your normals (for example, by running the osgUtil::SmoothingVisitor) and then return the result to the system.

Glenn Waldron / osgEarth
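For reference, osgUtil::SmoothingVisitor computes per-vertex normals by averaging the face normals of the triangles that share each vertex. A minimal standalone sketch of that averaging step, in plain C++ with no OSG dependency (the Vec3 struct and function names here are illustrative, not OSG API):

```cpp
#include <array>
#include <cmath>
#include <vector>

struct Vec3 {
    float x = 0, y = 0, z = 0;
};

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return len > 0 ? Vec3{v.x / len, v.y / len, v.z / len} : v;
}

// Accumulate each triangle's face normal onto its three vertices,
// then renormalize -- the essence of "smoothing" a mesh's normals.
std::vector<Vec3> smoothNormals(const std::vector<Vec3>& verts,
                                const std::vector<std::array<unsigned, 3>>& tris) {
    std::vector<Vec3> normals(verts.size());
    for (const auto& t : tris) {
        Vec3 faceN = cross(sub(verts[t[1]], verts[t[0]]), sub(verts[t[2]], verts[t[0]]));
        for (unsigned i : t) {
            normals[i].x += faceN.x;
            normals[i].y += faceN.y;
            normals[i].z += faceN.z;
        }
    }
    for (auto& n : normals) n = normalize(n);
    return normals;
}
```

For a flat counter-clockwise quad in the z = 0 plane, every vertex normal comes out as (0, 0, 1), as expected.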
Re: [osg-users] Render normal vectors to image
Hi,

Thank you, Glenn.

Another, less related question. My model is a PagedLOD in osgb format, which does not contain normal vectors. How can I add callback functions to dynamically calculate the normal vectors and attach them to the normal buffer when loading?

Cheers,
Han

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=76900#76900
Re: [osg-users] Render normal vectors to image
Han,

Yes, you will need to use shaders. Attach your MRT (multiple-render-target) textures, as you are doing:

rtt->attach(osg::Camera::BufferComponent(osg::Camera::COLOR_BUFFER0), colorTex);
rtt->attach(osg::Camera::BufferComponent(osg::Camera::COLOR_BUFFER1), normalTex);
rtt->attach(osg::Camera::BufferComponent(osg::Camera::COLOR_BUFFER2), depthTex);

Then in your fragment shader, make an output corresponding to each buffer attachment:

in vec3 normal;
in vec4 color;
...
layout(location=0) out vec4 gcolor;
layout(location=1) out vec4 gnormal;
layout(location=2) out vec4 gdepth;
...
gcolor = color;
gnormal = vec4((normal+1.0)/2.0, 1.0);
gdepth = vec4(vec3(gl_FragCoord.z), 1.0);

Notice that you will need to remap the "normal" value from the [-1..1] range into [0..1] so it fits in a color buffer.

Hope this helps!

Glenn Waldron / osgEarth
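The remapping mentioned above is just an affine shift from [-1, 1] into [0, 1], plus its inverse when reading the rendered image back. A tiny standalone check of that round trip (plain C++; the function names are illustrative):

```cpp
#include <cmath>

// Map a normal component from [-1, 1] into the [0, 1] range a color
// buffer can store, and back again when reading the image.
float encodeNormalComponent(float n) { return (n + 1.0f) / 2.0f; }
float decodeNormalComponent(float c) { return c * 2.0f - 1.0f; }
```

For example, a unit normal (0, 0, 1) is stored as the RGB color (0.5, 0.5, 1.0).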
Re: [osg-users] Render normal vectors to image
Hi,

I think you must use a shader to achieve it. You can bind a regular color buffer, but write gl_FragColor.rgb = normal.xyz in the fragment shader. The drawback is that you need to render the node twice. OSG supports this approach.

If you're familiar with OpenGL 4.x, you can use GL_COLOR_ATTACHMENT_i framebuffer attachments and a multi-render-target setup to do this:

layout(location = 1) out vec4 normal;
...
normal = computedNormal;

But I'm not sure whether OSG supports this feature.

Thank you!

Cheers,
Yu

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=76896#76896
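A minimal fragment shader for the simple single-target, render-twice approach described above might look like this (GLSL 1.x style to match gl_FragColor; the varying name is illustrative), with the same [-1, 1] to [0, 1] remap any normal-to-color pass needs:

```glsl
varying vec3 vNormal;  // interpolated eye-space normal from the vertex shader

void main()
{
    // Remap from [-1, 1] into [0, 1] so the normal fits in a color buffer.
    vec3 n = normalize(vNormal);
    gl_FragColor = vec4((n + 1.0) / 2.0, 1.0);
}
```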
Re: [osg-users] Render normal vectors to image
Hi,

Thanks! I can use osgUtil to compute the normal vectors on the fly. My question is how to render the per-vertex or per-triangle normals to the image, e.g. so the color is the (x, y, z) of the normal vector rather than the color of the texture.

Cheers,
Han

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=76895#76895
Re: [osg-users] Render normal vectors to image
Hi,

You can use the normals pseudoloader. Just append .normals to your osgb file name on reading - mymodel.osgb.normals

--
trajce nikolov nick
[osg-users] Render normal vectors to image
Hi,

I would like to do offscreen rendering of a mesh. Currently, I have succeeded in doing this for both the color (RGB) and depth data using the built-in functions as below.

Code:

osg::ref_ptr<osg::Image> rttImage = new osg::Image;
osg::ref_ptr<osg::Image> depthImage = new osg::Image;

camera->attach(osg::Camera::COLOR_BUFFER, rttImage.get());
camera->attach(osg::Camera::DEPTH_BUFFER, depthImage.get());

rttImage->allocateImage(ATInfo.width, ATInfo.height, 1, GL_RGB, GL_UNSIGNED_BYTE);
depthImage->allocateImage(ATInfo.width, ATInfo.height, 1, GL_DEPTH_COMPONENT, GL_FLOAT);

But I also need to render the normal vectors to images too. I have googled this topic using native OpenGL, and it seems that I have to write shaders for it. But my meshes are defined using the osgb plugin format, so I must implement it in OSG.

I would like to know, is there an approach to do this in OSG? Thanks!

Cheers,
Han

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=76892#76892
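Since the depth image attached above stores nonlinear depth-buffer values, a common follow-up step is converting them back to eye-space distance using the camera's near and far planes. A standalone sketch of the standard inverse mapping for a perspective projection (plain C++, not OSG API; zNear and zFar are whatever your camera uses):

```cpp
#include <cmath>

// Convert a [0, 1] depth-buffer value (perspective projection) back to
// eye-space distance, given the camera's near and far plane distances.
float linearizeDepth(float depth, float zNear, float zFar) {
    float ndcZ = depth * 2.0f - 1.0f;  // window [0,1] -> NDC [-1,1]
    return (2.0f * zNear * zFar) / (zFar + zNear - ndcZ * (zFar - zNear));
}
```

A depth value of 0 maps to the near plane distance and 1 maps to the far plane distance.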