[osg-users] Qt widget scene shows up black

2012-03-17 Thread James Cotton
The demo application osgviewerQt shows the grid of images as black,
but the dump truck works just fine.  It's not a problem with finding
the file: adding the dump truck to the layout in place of one of the
others makes it disappear too.

I ran my app with OSG_NOTIFY_LEVEL=debug and don't see any apparent
difference in the output between the case where I have a single view
popup (works) and the case where it's embedded in a widget (black screen).

This is using Qt 4.8.1 and OSG master from GitHub, on OSX 10.7.3.  I've
tried compiling it various ways (+/- framework, Qt for OpenThreads) to
no avail.  Changing threading models with the application switches
didn't help either.  Oh, and it's not related to window decoration, as
the popup still works without it.

Any suggestions on how I can proceed debugging this?

Thanks,
James
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] Problems rendering to and capturing/reading GL_RGBA16

2012-03-17 Thread Sergey Polischuk
Hi,

IIRC, to use GL_RGB16I_EXT you should set the source format to
GL_RGB_INTEGER_EXT. Also, this is an unnormalized texture format, so AFAIK
you should write corresponding integer values to it (like
int(INT_MAX * value), where value is the original value in the range 0..1).

I've also used 32-bit-float-per-channel images to read back data without
any problems, but I didn't use textures with them; I attached the
osg::Image directly to the camera buffer. You can try it this way, but it
is slow as hell (assigning a PBO to the image may speed things up; I didn't
try it, as in my case great runtime speed wasn't needed).

17.03.2012, 00:22, "Chris Hanson" :

> A specialized tool I'm working on needs to have a 16-bit alpha channel
> for post-render analysis. I render to a screen-sized texture (1920x1080,
> NPOT) and then read it back to the CPU side and assess the values.
>
> I originally developed it with 8 bits per gun RGBA (32-bit total), where
> it works fine, but precision is very poor. I then switched my RTT camera
> destination texture to use GL_RGBA16:
>
>     RTTtex->setInternalFormat( GL_RGBA16 );
>     RTTtex->setSourceFormat( GL_RGBA );
>     RTTtex->setSourceType( GL_UNSIGNED_SHORT );
>
> This seems to work lovely, but now reading back and fetching pixels from
> the texture seems problematic.
>
> I capture with a post-draw callback (m_tex is the RTT texture):
>
>     void CameraImageCaptureCallback::operator()( osg::RenderInfo& ri ) const
>     {
>         osg::Image* image = m_tex->getImage();
>         // Get the texture image.
>         m_tex->apply( *( ri.getState() ) );
>         image->readImageFromCurrentTexture( ri.getContextID(), false,
>                                             GL_UNSIGNED_SHORT );
>     }
>
> I _think_ GL_UNSIGNED_SHORT is appropriate here (it was GL_UNSIGNED_BYTE
> when I was using regular 8-bit-per-channel rendering).
>
> To run this callback, I:
>
>     osg::ref_ptr< CameraImageCaptureCallback > captureCB;
>     outputImage = new osg::Image;
>     outputImage->setPixelFormat( GL_RGBA );
>     outputImage->setDataType( GL_UNSIGNED_SHORT ); // different in 8-bit mode
>     // This is the texture that rendering writes to.
>     osg::Texture2D* accumTex = GetAccumulationTexture();
>     accumTex->setImage( outputImage );
>     captureCB = new CameraImageCaptureCallback( accumTex );
>     GetPrerenderCamera()->setPostDrawCallback( captureCB.get() );
>     // Run the viewer one more time to capture to the attached image.
>     vpEnv.viewer->frame();
>
> Later, I examine the image pixel by pixel with:
>
>     alpha = lessonOutput->getColor( xLoop, yLoop ).a();
>
> All of this works dandy in 8-bit mode, but add 8 more bits and it goes to
> heck. This message:
>
> http://forum.openscenegraph.org/viewtopic.php?t=9879
>
> leads me to believe that some parts of OSG may not be supporting
> GL_RGBA16 properly, and in fact, I don't see it listed in the tokens in
> Image::computeNumComponents(), though less-well-known tokens like
> GL_RGB16I_EXT are there.
>
> I tried using GL_RGB16I_EXT as my RTT texture format, but OSG fails to
> set up the RTT FBO, so I didn't succeed with that avenue. I was able to
> use GL_RGB16F_ARB, but it seemed to behave very oddly all around, often
> refusing to clear the buffer when told to, so I abandoned that.
>
> Am I missing any steps to set up for a GL_RGBA16 texture? I got very
> confused around all of the formats and internal formats, and knowing what
> I needed to preconfigure in the osg::Image versus what OSG was going to
> set up for me. After the readImageFromCurrentTexture call, the Image's
> pixelFormat is set to GL_RGBA16, which makes calls like getColor fail
> because GL_RGBA16 isn't recognized.
>
> Is there a better way to render to a 16-bit int format? Am I just missing
> a critical step?
>
> --
> Chris 'Xenon' Hanson, omo sanza lettere. xe...@alphapixel.com
> http://www.alphapixel.com/
> Training • Consulting • Contracting
> 3D • Scene Graphs (Open Scene Graph/OSG) • OpenGL 2 • OpenGL 3 •
> OpenGL 4 • GLSL • OpenGL ES 1 • OpenGL ES 2 • OpenCL
> Digital Imaging • GIS • GPS • Telemetry • Cryptography • Digital Audio •
> LIDAR • Kinect • Embedded • Mobile • iPhone/iPad/iOS • Android
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] Texture repeats and overlaps when I don't want it to

2012-03-17 Thread Sergey Polischuk
It could be a problem with the image format (e.g. the osg plugin failing
to read a 16-bit-per-channel image) or something along those lines.

17.03.2012, 01:06, "Zachary Hilbun" :
> Hi,
>
> This turned out to be a problem with the image itself rather than my code.  
> On most PNG viewers, it looks like a normal image.  When I viewed it using 
> Internet Explorer, it overlaps itself the same way it does on my texturing.
>
> I got this image from my client so I don't know how it was created.  It could 
> be that it is an animated PNG and the overlap is what you see if you try to 
> view it in a viewer that doesn't support  animated PNG.  I'll talk to them 
> and see what the problem is.
>
> If it's an animated PNG, I may be looking for an editor to allow me to save 
> it as a single PNG image.
>
> Thank you!
>
> Cheers,
> Zachary
>
> --
> Read this topic online here:
> http://forum.openscenegraph.org/viewtopic.php?p=46374#46374
>
> ___
> osg-users mailing list
> osg-users@lists.openscenegraph.org
> http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] GL_STENCIL_TEST in draw implementation callback

2012-03-17 Thread Sergey Polischuk
AFAIK there are implementation-dependent restrictions on viewport size;
you may be hitting those limits.

16.03.2012, 19:14, "Doug Domke" :
> UGH!  I just found that I was mistaken about the video growing properly 
> without the stencil.  It actually doesn't.  Once its viewport exceeds a 
> certain size, it no longer renders as expected.  I think I've come as far as 
> I can here.
>
> Thanks anyway.
> Doug
>
> --
> Read this topic online here:
> http://forum.openscenegraph.org/viewtopic.php?p=46353#46353
>
> ___
> osg-users mailing list
> osg-users@lists.openscenegraph.org
> http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] Maintaining a reference to an osg node outside of osg

2012-03-17 Thread Robert Osfield
Hi Preet,

On 16 March 2012 19:53, Preet  wrote:
> Typically I use osg::ref_ptr<> to let osg handle reference counting.
> Right now I'm trying to create a 'rendering engine' for another
> library with osg. The idea for the other library is to maintain an API
> that allows different rendering engines -- osg, ogre, vtk, etc.

Others have amply answered the ref counting question so I won't add
anything further than this.

I would like to question the idea of adding a rendering API
abstraction layer.  VTK is a very different beast from OSG or OGRE,
and the latter two, while they have more in common with each other
than they do with VTK's approach, are still very different in design
and implementation.  This means the abstraction layer would have to do
a lot of work just to try and fit them all into the same box.  In
doing this fitting you'll almost certainly make it harder to do any
advanced usage of whichever underlying API you are actively using at
any time.  What you are gunning for is hard to design, hard to
implement and hard to maintain.

The next question I'd like to pose is: what do you gain from this
abstraction layer?  The pain is high with the approach you want to
take, so the benefit will have to be huge.

Personally I can't think of a good reason to justify all the pain and
the hampering of final functionality.  I would recommend not going to
the lengths of trying to abstract away from the APIs.  Work out what
you need to do with your application, work out what you need from your
rendering API, and match this up against the candidates in front of
you.  All of the candidates have been around for a good period and are
mature SDKs, each with their own strengths and weaknesses.  One thing
for sure is that they have lived long enough, and have big enough
communities, to know that they will be around for a good while longer,
and they're all open source, so there isn't any reason to hedge your
bets on which one will be around in 5 or 10 years' time; there is a
good chance they all will be.  So work out which one fits best, pick
it, and discard the rest.

This approach will be the most productive in the short term, mid term
and long term.  Software projects are typically hard enough as it is
without putting extra roadblocks of complexity in your way.

Robert.
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] Frame Rate "Decay" w/ SilverLining Integration

2012-03-17 Thread Wojciech Lewandowski
Hi Brad,

Thank you for the report. I personally investigated and reported another
issue with VBOs on Nvidia drivers a few months ago, and that issue was
fixed in the 290 driver series. It may be a long shot, but I am curious
whether these two problems could be related. While investigating my issue
(http://forum.openscenegraph.org/viewtopic.php?t=9258&postdays=0&postorder=asc&start=0)
I got the following post from Ruben Smelik:

"[..]
This mail reminded me of an issue I had a couple of years ago with VBO's on
a particular Windows pc with a 9800GX2. I thought it was an issue of that
PC, as it was quite unstable, so I didn't report the problem at that time.
The solution I accidentally found back then was to turn off Threaded
Optimization in the NVidia Control Panel (it is on Auto by default).

But now I'm getting the bad result of your test on a GTX 480 (266.58
driver), and that "fix" works again. After turning off Threaded
Optimization, I see the proper gradient displayed.

Could you try this as well?
[..]"


Your drivers are 276.21, so pretty close to the 266.58 Ruben used. I am
now also curious whether you could turn off Threaded Optimization and/or
try newer drivers, and see if the problem still exists.

Cheers,
Wojtek


2012/3/16 Christiansen, Brad 

> Hi Woktej,
>
>
> Thanks for your offer to help out, but I have managed to track it down
> enough to have a good enough solution for now.
>
> For anyone else who stumbles across this issue, my workaround is to
> disable VBOs in SilverLining. Doing this via the SILVERLINING_NO_VBO
> environment variable crashed, so I simply hard-coded them to off in
> SilverLiningOpenGL.cpp. I narrowed down the source of the issue to calls
> to AllocateVertBuffer in the same file. Even if the buffers are never
> used, simply allocating them for the 6 faces of the sky box is enough to
> cause things to go wrong.
>
>
> I am using version 2.35 of SilverLining.
>
> VS2010 SP1
>
> OSG trunk as of a month or two ago
>
> Windows 7
>
> Nvidia GTX460M Driver Version 267.21
>
>
> The same problem was also occurring on another machine. I think that had a
> 450GT in it, but otherwise the same.
>
>
> Cheers,
>
>
> Brad
>
>
> *From:* osg-users-boun...@lists.openscenegraph.org [mailto:
> osg-users-boun...@lists.openscenegraph.org] *On Behalf Of *Wojciech
> Lewandowski
> *Sent:* Saturday, 17 March 2012 1:59 AM
>
> *To:* OpenSceneGraph Users
> *Subject:* Re: [osg-users] Frame Rate "Decay" w/ SilverLining Integration
>
>
> Hi, Brad,
>
>
> We have a SilverLining source code license. I may find a few hours next
> week to look at the problem, if the issue can be reproduced on one of my
> machines (i.e. Nvidia GF580/GF9400 or GF540M). I would like to have as
> much info as possible to replicate the issue, though. I would like to know:
>
>
> - System version
>
> - OSG version
>
> - Graphics board and driver version (dual monitor setup? GPU panel tweaks)
>
> - Compiler/linker VS studio version
>
> - SilverLining version. If not yet tested, I would be grateful if you
> could test it with the latest trial SilverLining SDK, to be sure it's not
> fixed already.
>
>
> What exactly is done with SilverLining? What cloud types / wind settings
> / lightning etc. are used? Each type of SilverLining cloud entity has its
> own specific parameters, and each can be drawn with a different algorithm
> and use different graphics resources, so it may be important to know
> which SilverLining resources are in use. Probably the best would be if
> you could send the sample source you are testing.
>
>
> Cheers,
>
> Wojtek Lewandowski
>
>
> 2012/3/16 Christiansen, Brad 
>
> Hi,
>
>  
>
> Thanks for the response. I have a few more details on the problem but am
> still completely stumped.
>
>  
>
> This is my test:
>
> Start my application and leave it running for a while.  Frame rate, memory
> use etc all stable.
>
> Enable SilverLining.
>
> As reported by gDebugger, after the initial expected increase, the number
> of reported OpenGL calls, vertices, texture objects (and every other
> counter they have) stays completely stable, except for the frame rate,
> which decreases at a steady rate, a few frames each second.
>
>  
>
> In the earlier thread, it was noted that changing the threading model
> seemed to 'reset' the frame rate. I looked into this some more, and it
> seems the behaviour is 'reset' when the draw thread is recreated and
> started. If you return to a thread that had previously 'decayed', it
> continues to decay from where it left off.
>
> e.g. singlethreaded decays to 50fps
>
> switch to draw thread per context and the frame rate goes back to 100fps
> when a new thread is created, and it starts decaying again
>
> switch back to single threaded (no new threads are created) a

Re: [osg-users] Frame Rate "Decay" w/ SilverLining Integration

2012-03-17 Thread Frank Kane
Hi folks, sorry I didn't chime in earlier - was on the road.

Like Wojtek said, this doesn't seem to be reproducible when using newer NVidia 
drivers, and it's not an issue that we hear about often. Good driver support of 
vertex buffer objects used to be pretty spotty, which is why we introduced the 
SILVERLINING_NO_VBO environment variable to bypass them. 
SILVERLINING_NO_BINDLESS exists for the same reason (for bindless graphics on 
NVidia.)

However, you're right that using SILVERLINING_NO_VBO does lead to a crash
in the current SDK. I've fixed that for the upcoming SilverLining 2.36
release (all it takes is changing the SL_FREE(buf) on line 238 of
SilverLiningOpenGL.cpp to free(buf)).

I'll add a blurb in the troubleshooting section of our docs about what to do if 
you experience mysterious framerate degradation as well.

Thanks for using our products,

Frank Kane
Founder
Sundog Software LLC

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=46386#46386





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


[osg-users] Application searches for ot12-OpenThreads.dll instead of the debug

2012-03-17 Thread Stefanos Kougiou
Hi,

I have been trying to build an app relying on OSG 3.0, OSGart 2rc3 and VS 2008.
I am not an expert on MS VS 2008 so maybe I have missed something while 
configuring. 

I have built everything in both debug configuration and release.

The problem is that when starting the debugger I get a message that
"ot12-OpenThreads.dll is missing...", even though I have only included
debug versions as dependencies.

The message should be about "ot12-OpenThreadsd.dll", which is my debug DLL.

I have included the following in my project's "linker->additional dependencies"

kernel32.lib
user32.lib
gdi32.lib
winspool.lib
shell32.lib
ole32.lib
oleaut32.lib
uuid.lib
comdlg32.lib
advapi32.lib
"C:\osgART_debug.lib"
"C:\lib\osgd.lib"
"C:\lib\osgDBd.lib"
"C:\lib\osgGAd.lib"
"C:\lib\osgViewerd.lib"
"C:\lib\osgWidgetd.lib"
glu32.lib
opengl32.lib

The osgART_debug.lib is produced by the osgART project,
which has the following "additional dependencies" in project options

kernel32.lib
user32.lib
gdi32.lib
winspool.lib
shell32.lib
ole32.lib
oleaut32.lib
uuid.lib
comdlg32.lib
advapi32.lib
"C:\lib\OpenThreadsd.lib"
"C:\lib\osgd.lib"
"C:\lib\osgDBd.lib"
glu32.lib
opengl32.lib

I can't figure out why my project asks for "ot12-OpenThreads.dll".
I also checked the dependencies of the projects that produce these libraries:
osgd.lib
osgDBd.lib
osgGAd.lib
osgViewerd.lib
osgWidgetd.lib

None of them depends on OpenThreads.lib.
Any comment would be really appreciated!
Thanks!

Stefanos

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=46387#46387





___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org