Hi, I'm working on a shader using the new Shader Model 3 standard to take advantage of vertex texture fetch (access to textures in a vertex program).
I'm following the NVIDIA specs from http://download.nvidia.com/developer/Papers/2004/Vertex_Textures/Vertex_Textures.pdf

In OpenSG I set up the texture like this (in accordance with the specs):

    texture[i]->setScale(false);
    texture[i]->setTarget(GL_TEXTURE_2D);
    texture[i]->setInternalFormat(GL_RGBA_FLOAT32_ATI);
    texture[i]->setMinFilter(GL_NEAREST); // GL_LINEAR
    texture[i]->setMagFilter(GL_NEAREST); // GL_LINEAR
    texture[i]->setWrapS(GL_CLAMP_TO_EDGE);
    texture[i]->setWrapT(GL_CLAMP_TO_EDGE);

and my sphere is rendered correctly.

Now I'm starting to use texture images whose dimensions are not powers of two, so I changed texture[i]->setScale(false); to texture[i]->setScale(true); and strange behaviour occurs: the sphere is rendered as if the RGBA values were scrambled (see the image link below).

Does anybody have an idea of what is happening? I don't think it's a problem in OpenSG, but any help pointing me in the right direction would be appreciated.

Some useful info:
++ I'm using RGBA images in 32-bit floating point
++ working on an NVIDIA graphics card with Shader Model 3 capabilities
++ image showing what I mean: http://www.blending-life.org/images/test.jpg

Thanks,
--
Josef Grunig
www.blending-life.org

_______________________________________________
Opensg-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/opensg-users
