Re: [osg-users] osg::Image with signed int
Hi Pau,

Pau Moreno wrote:
> Hi, I've created an int matrix[16][256] and I need to pass it to the GPU. So what I've done is create an osg::Image, but when I pass the data to setImage I have to cast my data, because it needs an unsigned char *.

The unsigned char * is just to get at the start of your data; it won't throw away any of your information. E.g. we pass floats and still just point this parameter at the start of the data.

> My matrix contains signed ints, so I cannot cast it to an unsigned char, as I lose information. How can I do it?

The problem is not the unsigned char *, but the pixel format of the texture you are passing the data into. You will have to find an int pixel format that supports signed values (I'm not sure one exists), or you will have to switch to 16/32-bit float pixel formats on the GPU. The osgprerender example has a float path that can give you some hints.

How are you going to process the data? Do you want to do rendering or GPGPU work?

regards
jp

> Thanks! Cheers, Pau

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=15932#15932

___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
Re: [osg-users] osg::Image with signed int
Hi J.P.,

I finally figured out a way so that I don't need the signed int. Anyway, I think my real problem is that, for some reason, the texture is not being uploaded correctly to the GPU :S

In my OpenGL code I have:

Code:

glGenTextures(1, &(this->triTableTex));
glActiveTexture(GL_TEXTURE2);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, this->triTableTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA16I_EXT, 16, 256, 0,
             GL_ALPHA_INTEGER_EXT, GL_INT, triTable);

and in OSG I do:

Code:

osg::Texture2D *tex = new osg::Texture2D();
tex->setFilter( osg::Texture2D::MIN_FILTER, osg::Texture2D::NEAREST );
tex->setFilter( osg::Texture2D::MAG_FILTER, osg::Texture2D::NEAREST );
tex->setWrap( osg::Texture2D::WRAP_S, osg::Texture2D::CLAMP_TO_EDGE );
tex->setWrap( osg::Texture2D::WRAP_T, osg::Texture2D::CLAMP_TO_EDGE );
tex->setWrap( osg::Texture2D::WRAP_R, osg::Texture2D::CLAMP_TO_EDGE );
//tex->setDataVariance( osg::Object::DYNAMIC );

unsigned char *cc = new unsigned char[256*16];
for ( int i = 0 ; i < 256 ; i++ )
    for ( int ii = 0 ; ii < 16 ; ii++ )
    {
        //cc[ ii + i*16 ] = triTable[i][ii];
        cc[ ii + i*16 ] = 127;
    }

osg::Image *image = new osg::Image();
image->setImage( 16, 256, 0,
                 GL_ALPHA16I_EXT, GL_ALPHA_INTEGER_EXT, GL_INT,
                 (unsigned char *)triTable, osg::Image::NO_DELETE );
tex->setImage( image );
brickState->setTextureAttributeAndModes(1, tex, osg::StateAttribute::ON);

I'm just setting all the values to 127 so I can check on the GPU whether the texture reaches the geometry shader. I need these values on the GPU, as I'm using them for a Marching Cubes based algorithm. Do you think I need to do something else?

Thanks!
Cheers, Pau

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=15980#15980
Re: [osg-users] osg::Image with signed int
Hi,

I'm sorry about that, but the texture upload wasn't the problem after all. That's the trouble with working on the GPU: something is wrong, but you have no way to determine what :P

The real problem was that in the shader I had a vec3 array of 8 values, and I was setting it like:

Code:

osg::Uniform* vtxDecal0U = new osg::Uniform("vertDecals[0]", osg::Vec3f( 0.0f, 0.0f, 0.0f ));
...

which works fine in plain OpenGL, but does not seem to work with OSG; the value was never set. I work around it now with 8 different variables, but if anyone knows how to solve it, that would be great :)

Thank you!

Cheers, Pau

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=15993#15993
[osg-users] osg::Image with signed int
Hi,

I've created an int matrix[16][256] and I need to pass it to the GPU. So what I've done is create an osg::Image, but when I pass the data to setImage I have to cast my data, because it needs an unsigned char *. My matrix contains signed ints, so I cannot cast it to an unsigned char, as I lose information. How can I do it?

Thanks!

Cheers, Pau

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=15932#15932