Re: [osg-users] Enabling Vsync gives dramatic increase in latency

2015-04-17 Thread Jan Ciger
On Thu, Apr 16, 2015 at 7:43 PM, Björn Blissing bjorn.bliss...@vti.se
wrote:


 That does not seem entirely correct. If you look at the values for running
 without Vsync, I have managed to get down to 4 ms with a mean of 14 ms. So I
 guess that my screen has a scan-out time of ~4 ms, and since I am rendering
 at ~3000 fps without Vsync, I have somehow managed to send a frame just as
 the screen starts to scan out a new image.


That actually sounds odd, because the monitor will not refresh the image
faster than its fixed refresh rate. 4 ms would require a 250 Hz refresh, and
I am not aware of any commonly sold LCD that could go that fast. Even 120 Hz
ones are quite rare. Are you sure it is not an artefact of your measurement
method? I.e. you start your timer circuit from the PC when you send a new
image, but the light sensor is still registering the light from the previous
frame, so it triggers right away, giving you a false low reading and
essentially showing only how long it took the GPU to send out the frame. If
you aren't doing so already, you may have to arm the light trigger only once
it has seen dark before light, to make sure that it is not triggering on old
data. An alternative method could be a simple R-C high-pass filter circuit
that generates a pulse to stop your timer only on a transition (from dark to
light or vice versa) and ignores the steady level.
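
A minimal sketch of that arm-on-dark logic (readLightSensor(), stopTimer()
and the two thresholds are hypothetical placeholders for whatever ADC/timer
interface your rig exposes):

    // Arm on dark, fire only on the dark->light edge, never on a steady level.
    bool armed = false;
    for (;;)
    {
        int level = readLightSensor();        // hypothetical ADC read
        if (level < DARK_THRESHOLD)
        {
            armed = true;                     // saw dark: arm the trigger
        }
        else if (armed && level > LIGHT_THRESHOLD)
        {
            stopTimer();                      // dark->light edge: stop the timer
            armed = false;                    // re-arm only after the next dark period
        }
    }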

J.
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] Enabling Vsync gives dramatic increase in latency

2015-04-17 Thread Björn Blissing
Hi again,

I also tried Robert's suggestion of using the trunk and 
OSG_SYNC_SWAP_BUFFERS=ON.
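
For reference, a minimal sketch of setting this up programmatically (assuming
the variable is read when the graphics context is created, so it must be set
before the viewer is realized; setenv is POSIX, use _putenv on Windows):

    #include <cstdlib>
    #include <osgDB/ReadFile>
    #include <osgViewer/Viewer>

    int main()
    {
        // Equivalent to `export OSG_SYNC_SWAP_BUFFERS=ON` in the shell.
        setenv("OSG_SYNC_SWAP_BUFFERS", "ON", 1);

        osgViewer::Viewer viewer;
        viewer.setSceneData(osgDB::readNodeFile("cow.osg"));
        return viewer.run();
    }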

To get better averages I took the measurements over 100 cycles:

Recorded data:
=
OSG 3.2.1 + Nvidia Defaults + VSync On:
min_latency =  59ms
max_latency =  65ms
mean_latency =  62ms

OSG Trunk + Nvidia Defaults + VSync On + SYNC_SWAP_BUFFERS=OFF:
min_latency =  59ms
max_latency =  64ms
mean_latency =  62ms

OSG Trunk + Nvidia Defaults + VSync On + SYNC_SWAP_BUFFERS=ON: 
min_latency =  42ms
max_latency =  48ms
mean_latency =  45ms

OSG Trunk + Custom Settings as previous + VSync On:
min_latency =  10ms
max_latency =  40ms
mean_latency =  21ms

Regards,
Björn

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=63450#63450







Re: [osg-users] Enabling Vsync gives dramatic increase in latency

2015-04-17 Thread Robert Osfield
Hi Björn,

What happens when you use the custom NVidia settings and SYNC_SWAP_BUFFERS=ON?

Robert.

On 17 April 2015 at 13:58, Björn Blissing bjorn.bliss...@vti.se wrote:
 Hi again,

 I also tried Robert's suggestion of using the trunk and 
 OSG_SYNC_SWAP_BUFFERS=ON.

 To get better averages I took the measurements over 100 cycles:

 Recorded data:
 =
 OSG 3.2.1 + Nvidia Defaults + VSync On:
 min_latency =  59ms
 max_latency =  65ms
 mean_latency =  62ms

 OSG Trunk + Nvidia Defaults + VSync On + SYNC_SWAP_BUFFERS=OFF:
 min_latency =  59ms
 max_latency =  64ms
 mean_latency =  62ms

 OSG Trunk + Nvidia Defaults + VSync On + SYNC_SWAP_BUFFERS=ON:
 min_latency =  42ms
 max_latency =  48ms
 mean_latency =  45ms

 OSG Trunk + Custom Settings as previous + VSync On:
 min_latency =  10ms
 max_latency =  40ms
 mean_latency =  21ms

 Regards,
 Björn

 --
 Read this topic online here:
 http://forum.openscenegraph.org/viewtopic.php?p=63450#63450







Re: [osg-users] StatsHandler missing numbers

2015-04-17 Thread Christian Buchner
One major problem with the Intel drivers is that their products are
typically EOL'ed (end-of-lifed) three years after first release - at which
point they stop receiving driver updates (not even bug fixes).

You can see the problem with the integrated graphics in Intel's Sandy
Bridge chips (introduced Q1 2011). You cannot get any OpenGL drivers dated
newer than April 2014 - and some OpenGL features we require simply aren't
working with this driver release.

And so we have to tell some of our customers that their laptops must use
the Ivy Bridge or later microarchitecture - or be equipped with a separate
dedicated graphics chip.

Christian


2015-04-17 15:23 GMT+02:00 Andreas Schreiber a...@online.de:

 Hi Émeric,

 thx for the response and the idea ;)

 My driver version is:
 Intel(R) HD Graphics 4000

 Driver Version: 10.18.10.3958
 Operating System: Windows* 8.1 (6.3.9600)
 Installed DirectX* Version: 11.2
 Supported DirectX* Version: 11.0
 Shader Version: 5.0
 OpenGL* Version: 4.0
 OpenCL* Version: 1.2
 Physical Memory: 8036 MB
 Processor: Intel(R) Core(TM) i5-3320M CPU @ 2.60GHz
 Processor Speed: 2594 MHz
 Vendor ID: 8086
 Device ID: 0166
 Device Revision: 09

 * Processor Graphics Information *
 Processor Graphics in Use: Intel(R) HD Graphics 4000
 Video BIOS: 0.151
 Current Resolution: 1600 x 900


 I could not update it yet, but when I do and it helps, I will post it ;)

 ...

 Thank you!

 Cheers,
 Andreas

 --
 Read this topic online here:
 http://forum.openscenegraph.org/viewtopic.php?p=63453#63453







Re: [osg-users] StatsHandler missing numbers

2015-04-17 Thread Émeric MASCHINO
Hi Andreas,

What release of Intel drivers are you running?

I recently experienced issues with an application on various Dell
notebooks and Intel NUC barebones (all HD4000-based) where texts
printed with glRasterPos3 were not displayed at all with driver
release 8.x. Upgrading to release 9.x (from Dell and Intel websites)
solved the issue. I didn't find anything clear w.r.t. this problem in
the Intel driver changelogs, though.

It's noteworthy that the application I was referring to isn't
OSG-related (plain OpenGL) and was running on Windows 7 (not 8.1), but
who knows!

 Émeric

2015-04-16 19:56 GMT+02:00 Andreas Schreiber a...@online.de:
 Hi,

 the same thing happens.

 If it starts in fullscreen, the cow model seems broken. It's partly
 transparent...
 If I try making a screenshot, the screenshot is totally fine.
 You must think that I'm crazy...

 If I press 's' for the stats in fullscreen, the cow model is shown
 normally =).

 In fullscreen they are drawn in a bright gray color.
 In window mode you only see the zeros; the others are missing or too bright.

 I should have the newest drivers, but I will check them.
 ...

 Thank you!

 Cheers,
 Andreas

 --
 Read this topic online here:
 http://forum.openscenegraph.org/viewtopic.php?p=63444#63444






[osg-users] makeLookAt and Oculus rift quaternion question

2015-04-17 Thread João
I have a 3D scene set up in OSG, and I'm trying to use an Oculus Rift to view
the scene. What I'm trying to do is calibrate an initial position using
makeLookAt and grab the rotation in order to focus the camera on a specific
spot of the scene.
The problem arises when I then attempt to stack the Oculus rotation quaternion
on top of the makeLookAt rotation; the math eludes me and I cannot seem to get
it working correctly.


Re: [osg-users] StatsHandler missing numbers

2015-04-17 Thread Andreas Schreiber
Hi Émeric,

thx for the response and the idea ;)

My driver version is: 
Intel(R) HD Graphics 4000

Driver Version: 10.18.10.3958
Operating System: Windows* 8.1 (6.3.9600)
Installed DirectX* Version: 11.2
Supported DirectX* Version: 11.0
Shader Version: 5.0
OpenGL* Version: 4.0
OpenCL* Version: 1.2
Physical Memory: 8036 MB
Processor: Intel(R) Core(TM) i5-3320M CPU @ 2.60GHz
Processor Speed: 2594 MHz
Vendor ID: 8086
Device ID: 0166
Device Revision: 09

* Processor Graphics Information *
Processor Graphics in Use: Intel(R) HD Graphics 4000
Video BIOS: 0.151
Current Resolution: 1600 x 900


I could not update it yet, but when I do and it helps, I will post it ;)

... 

Thank you!

Cheers,
Andreas

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=63453#63453







Re: [osg-users] Enabling Vsync gives dramatic increase in latency

2015-04-17 Thread Björn Blissing

robertosfield wrote:
 Hi Björn,
 
 What happens when you use the custom NVidia settings and SYNC_SWAP_BUFFERS=ON?
 
 Robert.
 


Hi Robert,

The latency values for SYNC_SWAP_BUFFERS=ON and OFF were exactly the same with
the custom settings.

Regards
Björn

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=63455#63455







Re: [osg-users] Android osgPlugins

2015-04-17 Thread Christian Kehl
Jan Ciger jan.ciger@... writes:

 On Thu, Apr 16, 2015 at 3:33 PM, Christian Kehl Christian.Kehl at
 uni.no wrote:
  How do I make sure it uses the .a-files (static libraries)?

 Check whether you have any .so files generated for OSG. You shouldn't have
 any, only .a libraries.

 J.

Checked - only .a files available. No shared objects/dynamic libraries
built, all static.






Re: [osg-users] Enabling Vsync gives dramatic increase in latency

2015-04-17 Thread Björn Blissing

Jan Ciger wrote:
 
 That actually sounds odd, because the monitor will not refresh the image 
 faster than its fixed refresh rate. 4ms would require 250Hz refresh, and I am 
 not aware of any commonly sold LCD that could go that fast. Even 120Hz ones are 
 quite rare. Are you sure it is not an artefact of your measurement method? 


Well, I certainly cannot give any total guarantees, but the output data looks
relatively sane. I am cycling the quad color at 1 Hz and sampling the light
sensor at 1 Hz.

Looking at one cycle:
http://i.imgur.com/mPZ70Uw.png

The blue line denotes the color of the quad (0 = black, 1 = white).
The green line denotes the analog output from the light sensor. I measure the
time difference between the changes in the two signals.
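
For illustration, the stimulus side of such a rig can be driven by an update
callback along these lines (a simplified sketch, not my exact code; it
assumes the quad's geometry has display lists disabled, uses a single overall
color binding, and that the sensor timestamps come from an external DAQ):

    #include <osg/Geode>
    #include <osg/Geometry>
    #include <osg/NodeCallback>

    // Toggle the quad between black and white at 1 Hz and log the
    // "command" timestamp at each transition.
    class ToggleColorCallback : public osg::NodeCallback
    {
    public:
        ToggleColorCallback() : _white(false) {}

        virtual void operator()(osg::Node* node, osg::NodeVisitor* nv)
        {
            double t = nv->getFrameStamp()->getReferenceTime();
            bool white = (static_cast<int>(t) % 2) == 0;   // 1 Hz square wave
            if (white != _white)
            {
                _white = white;
                osg::Geode* geode = dynamic_cast<osg::Geode*>(node);
                osg::Geometry* geom =
                    dynamic_cast<osg::Geometry*>(geode->getDrawable(0));
                osg::Vec4Array* colors =
                    static_cast<osg::Vec4Array*>(geom->getColorArray());
                (*colors)[0] = _white ? osg::Vec4(1,1,1,1)
                                      : osg::Vec4(0,0,0,1);
                colors->dirty();
                // log osg::Timer::instance()->tick() here as the start of
                // the latency measurement
            }
            traverse(node, nv);
        }
    private:
        bool _white;
    };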

I ran a measurement series of 100 cycles and plotted the latency times:
http://i.imgur.com/NfSzeIx.png

First of all, it seems that the transition from white to black is consistently
faster than black to white.

Looking at this plot, it seems like the latency times vary according to a
pattern. My guess is that the screen runs asynchronously with the GPU: the
screen polls the GPU at 60 Hz, and sometimes the GPU just happens to have a
frame ready when the screen starts a scanout. The lower limit is just the pure
scanout time of the display. But this is just my theory right now; I do not
have detailed knowledge of the inner workings of an LCD display.

Regards
Björn

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=63449#63449







Re: [osg-users] Enabling Vsync gives dramatic increase in latency

2015-04-17 Thread Jan Ciger
On Fri, Apr 17, 2015 at 1:57 PM, Björn Blissing bjorn.bliss...@vti.se
wrote:

 Looking at this plot it seems like the latency times are varying according
 to a pattern. My guess is that the screen runs asynchronously with the GPU.


That it certainly does. Modern LCDs are not tied to the VSYNC/HSYNC and
pixel clock signals anymore as the old CRTs were, because there is always
image processing logic in between, if for nothing else than for scaling
the image to the full resolution of the LCD panel. With packet-oriented
digital connections like DisplayPort it gets decoupled even further
(DVI/HDMI is still essentially emulating the old VGA, which in turn
emulates the old TV conventions for video, even though the way the signals
are encoded electrically is very different). Oh, and it gets better - in
fact, some monitors completely ignore VSYNC/HSYNC signals when connected by
a digital connection (HDMI/DisplayPort), because they know how many pixels
they need to draw, so they simply count the incoming bytes - something that
was not possible with the purely analog VGA signals and analog monitors.

There are basically 4 completely asynchronous systems here:

- your code
- the GPU
- the monitor processor
- the LCD panel itself

The first two can be synchronized using VSYNC and/or fences, the second two
using the HDMI/DVI/VGA electrical signals, and the monitor processor is
synchronized with the LCD panel, but there is no mutual synchronisation
between these pairs; there are, in fact, buffers between them to account
for the differing processing speeds. And you can typically only observe the
first and the last one ...

 So the screen polls the GPU at 60Hz and sometimes the GPU just happens to
 have a frame ready when the screen starts a scanout.

 The lower limit is just the pure scanout time of the display. But this is
 just my theory right now. I do not have the detailed knowledge of the inner
 workings of a LCD display.


The way it works is the opposite - the GPU generates the data at whatever
resolution and refresh rate the screen declares that it supports
(determined via EDID, unless manually overridden in the driver settings) and
sends these to the processor in the display. The display then does any
processing it needs and only actually flips the pixels on the panel when
ready, independently from the GPU. Which is always delayed a bit -
depending on how much processing the display does. There is no polling, the
video connection is essentially one way only (not counting service data
like EDID).

In comparison, the old CRTs had low latency, because the analog signal from
the GPU was driving the deflection coils steering the electron beam
practically directly, with no buffering or processing. When the GPU started
a new frame by sending VSYNC, the monitor really made the beam jump to the
upper left corner at that moment. That also explains why you could generate
weird, unsupported resolutions out of an analog CRT screen and why you
could potentially fry it with a resolution or refresh rate too high for it to
handle - the deflection coil electronics would typically overheat, drawing
too much current.

I think the jitter you are seeing - sometimes very low latency, sometimes
over a frame - comes from a different source: namely your program and the
VSYNC handling by the GPU.
The GPU will always generate the output signal the same way, VSYNC or no
VSYNC, otherwise the monitor may not be able to handle it and sync to it.
What happens is that sometimes your program gets lucky and tells the GPU
to swap buffers just in time before the start of the next frame - then
you have very little latency, because the change gets visible almost
immediately (modulo the input latency of the monitor above). On the other
hand, sometimes you get unlucky: you swap buffers right after the scanout
of the framebuffer has started and then the GPU will hold your image until
the next frame cycle - poof, one frame of latency extra ... And you can
have everything in between these two extreme cases.

When VSYNC is on, it gets even more complicated, because then you are
telling the GPU to synchronize the userspace code with the frame scanout
start (not the start of the physical frame on the monitor when your sensor
reacts - remember, the GPU has no control at all over the image processor
in the monitor!). This is typically an extremely inefficient thing to do
from the driver's point of view, because you are stalling the GPU until the
new frame is due, so the drivers often play games here - like not really
blocking your program waiting for the frame start but returning right away
and buffering your frame internally. The frame then gets sent out later when
convenient (i.e. on the next scanout cycle). They can even hold several
frames back like that and block only when this frame queue runs out of
space (you were really rendering too fast). Especially Nvidia is known for
these VSYNC shenanigans in their driver. This is what Robert was ...

Re: [osg-users] Android osgPlugins

2015-04-17 Thread Jan Ciger
On Fri, Apr 17, 2015 at 1:26 PM, Christian Kehl christian.k...@uni.no
wrote:


 checked - only .a files available. No shared objects/dynamic libraries
 build, all static.


Then I am really out of ideas :( Perhaps Jordi will know what could be
going on there.

J.


Re: [osg-users] Enabling Vsync gives dramatic increase in latency

2015-04-17 Thread Jan Ciger
On Fri, Apr 17, 2015 at 6:10 PM, Björn Blissing bjorn.bliss...@vti.se
wrote:


 But my main point still stands; it is possible to record latencies that
 are close to the scanout time of the screen with VSync Off (albeit for very
 simple rendering).


Yes, but not consistently for every frame. Assuming that each frame you are
drawing takes the same time, you need to get lucky for that to happen and
then the next frame is going to have a much higher latency. When you do get
lucky, the next frame buffer swap will happen slightly earlier now relative
to the start of the scanout, because you didn't start drawing after the
vsync event but earlier. This will keep going on for several frames,
shifting the buffer swap point earlier and earlier relative to the scanout
cycle until you, again, hit that sweet spot where you measure the low
latency. Your measurement shows this beating between the scanout/display
update frequency and your buffer swaps (when you are starting the timer, I
presume) quite nicely.
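
To make the beating concrete, here is a toy model (illustrative numbers only:
a fixed 15 ms frame time against a 60 Hz scanout; watch the wait column drift
and then wrap around):

    #include <cmath>
    #include <cstdio>

    int main()
    {
        const double scanout_ms = 1000.0 / 60.0;  // ~16.7 ms per refresh
        const double render_ms  = 15.0;           // assumed frame time
        double t = 0.0;                           // time of each buffer swap
        for (int frame = 0; frame < 10; ++frame)
        {
            // Wait until the next scanout boundary picks this frame up.
            double wait = std::ceil(t / scanout_ms) * scanout_ms - t;
            std::printf("frame %d: swap at %6.2f ms, waits %5.2f ms\n",
                        frame, t, wait);
            t += render_ms;  // the swap point drifts relative to the scanout
        }
        return 0;
    }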

The best you can do from the latency point of view is to synchronize to the
scanout using fences. Then you will always have at most one frame of
latency (the GPU is scanning out the previous frame from the front buffer
while you are drawing the current one in the back buffer) if you can fit
your drawing calls within the duration of a single frame.
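
For the curious, the fence mechanism boils down to a GL sync object inserted
after the frame's draw calls (a bare-bones sketch using core ARB_sync calls;
in an OSG application this would typically live in a camera final-draw or
swap callback):

    // Create a fence after submitting the frame, then block the CPU until
    // the GPU has actually consumed it, so the next frame starts in
    // lockstep with the scanout instead of queueing up in the driver.
    GLsync fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
    glClientWaitSync(fence, GL_SYNC_FLUSH_COMMANDS_BIT,
                     20 * 1000 * 1000);   // 20 ms timeout, in nanoseconds
    glDeleteSync(fence);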

Don't get confused by the late latching tricks that Oculus is promoting.
That has nothing to do with display latency, but with user input/tracking latency.
There they incorporate the tracking data at the very end of the rendering
process just before the buffer swap, achieving very low apparent input
latency relative to when the frame is going to be scanned out. But the
display latency (the time from buffer swap to pixels changing on the
screen) is unaffected by that.


Re: [osg-users] Enabling Vsync gives dramatic increase in latency

2015-04-17 Thread Björn Blissing

Jan Ciger wrote:
 
 Yes, but not consistently for every frame. Assuming that each frame you are 
 drawing takes the same time, you need to get lucky for that to happen and 
 then the next frame is going to have a much higher latency.


Of course! I am certainly not claiming that hitting optimal latency is possible 
in every frame. 


Jan Ciger wrote:
 When you do get lucky, the next frame buffer swap will happen slightly 
 earlier now relative to the start of the scanout, because you didn't start 
 drawing after the vsync event but earlier. This will keep going on for 
 several frames, shifting the buffer swap point earlier and earlier relative 
 to the scanout cycle until you, again, hit that sweet spot where you measure 
 the low latency. Your measurement shows this beating between the 
 scanout/display update frequency and your buffer swaps (when you are starting 
 the timer, I presume) quite nicely. 


Yes, this pretty much shows the phase difference between the frequency of frame 
calls from the CPU and the screen refresh frequency.


Jan Ciger wrote:
 The best you can do from the latency point of view is to synchronize to the 
 scanout using fences. Then you will always have at most one frame of latency 
 (the GPU is scanning out the previous frame from the front buffer while you 
 are drawing the current one in the back buffer) if you can fit your drawing 
 calls within the duration of a single frame.


Yes, this was my goal, but the problem was that I initially saw much higher
values and was trying to get them down to around one frame of added latency
when using Vsync On. The NVIDIA settings are not that helpful, and even with
SYNC_SWAP_BUFFERS I am still unable to hit latencies as low as with the
custom settings (whose inner workings are hidden).

My example is very synthetic and very far from normal rendering, but my idea
was to find any latency bottlenecks caused by application and driver
settings.


Jan Ciger wrote:
 Don't get confused by the late latching tricks that Oculus is promoting. 
 That has nothing to do with display latency, but with user input/tracking latency. 
 There they incorporate the tracking data at the very end of the rendering 
 process just before the buffer swap, achieving very low apparent input 
 latency relative to when the frame is going to be scanned out. But the 
 display latency (the time from buffer swap to pixels changing on the screen) 
 is unaffected by that.
 


Valve is using another idea, which Alex Vlachos presented at GDC: they start
rendering a new frame 2 ms before VSync, using some clever tricks to detect
when VSync is about to occur.

See pages 14-23 in the PDF at:
http://media.steampowered.com/apps/valve/2015/Alex_Vlachos_Advanced_VR_Rendering_GDC2015.pdf
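
The core of that "running start" trick, stripped down to a hedged sketch
(assumes you can estimate the vsync period and the last vsync time from a
history of swap timestamps; the names here are illustrative, not Valve's API):

    #include <chrono>
    #include <thread>

    using Clock = std::chrono::steady_clock;

    // Sleep until ~2 ms before the predicted next vsync, then sample the
    // tracker and start rendering with an almost-fresh head pose.
    void runningStart(Clock::time_point last_vsync,
                      std::chrono::microseconds vsync_period)
    {
        auto target = last_vsync + vsync_period - std::chrono::milliseconds(2);
        std::this_thread::sleep_until(target);
        // ... sample tracking, issue draw calls, swap buffers ...
    }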

Regards
Björn

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=63462#63462







Re: [osg-users] Enabling Vsync gives dramatic increase in latency

2015-04-17 Thread Björn Blissing

Jan Ciger wrote:
 
 What happens is that sometimes your program gets lucky and tells the GPU to 
 swap buffers just in time before the start of the next frame - then you 
 have very little latency, because the change gets visible almost immediately 
 (modulo the input latency of the monitor above). On the other hand, sometimes 
 you get unlucky, you swap buffers right after the scanout of the framebuffer 
 has started and then the GPU will hold your image until the next frame cycle 
 - poof, one frame of latency extra ... And you can have everything in between 
 these two extreme cases.


Yes, this is pretty similar to my theory, but as you say, it happens inside the
GPU and not the screen (as I erroneously guessed).

My data from the Vsync Off case supports this. Since I render at a much higher
rate than the screen refresh rate, the process can be described as almost
stochastic. In the best case I will swap just in time and get 0 ms latency +
the screen scanout time. In the worst case I will miss the frame and get 16 ms
+ the screen scanout time. But most cases will be somewhere in between, i.e. on
average 8 ms + the screen scanout time. This correlates nicely with my data:
the minimum latency was 6 ms, i.e. the screen scanout time is probably ~6 ms.
The mean was 14 ms, which is 8 + 6 ms. And the max latency was 21 ms, which is
slightly less than 6 + 16 ms.
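
That back-of-the-envelope model in code form (the 6 ms scanout figure is the
measured minimum from above; with a uniformly random swap phase the added
wait is uniform on [0, 16.7) ms):

    #include <cstdio>

    int main()
    {
        const double refresh_ms = 1000.0 / 60.0;  // ~16.7 ms refresh period
        const double scanout_ms = 6.0;            // measured panel scanout
        std::printf("min  ~ %4.1f ms\n", scanout_ms);                   // lucky swap
        std::printf("mean ~ %4.1f ms\n", scanout_ms + refresh_ms / 2);  // average miss
        std::printf("max  ~ %4.1f ms\n", scanout_ms + refresh_ms);      // just missed
        return 0;
    }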

But my main point still stands; it is possible to record latencies that are 
close to the scanout time of the screen with VSync Off (albeit for very simple 
rendering).

/Björn

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=63460#63460







Re: [osg-users] StatsHandler missing numbers

2015-04-17 Thread Trajce Nikolov NICK
Hi Andreas,

I have a pretty new Intel board in my laptop and ran into a few problems with
the StatsHandler - quite by accident I changed the threading model of the
Viewer and it went back to normal. This is just a hint - no clue why it is
happening, but maybe worth a try (see the one-liner below).
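
Changing the threading model is a one-liner (SingleThreaded is the usual
first thing to try on Intel graphics; the other ThreadingModel enum values
are worth cycling through as well):

    #include <osgViewer/Viewer>

    void configureViewer(osgViewer::Viewer& viewer)
    {
        // Force single-threaded cull/draw; alternatives include
        // CullDrawThreadPerContext and DrawThreadPerContext.
        viewer.setThreadingModel(osgViewer::ViewerBase::SingleThreaded);
    }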

cheers,
Nick

On Fri, Apr 17, 2015 at 3:31 PM, Christian Buchner 
christian.buch...@gmail.com wrote:


 One major problem with the Intel drivers is that their products are
 typically EOL'ed (end-of-lifed) three years after first release - at which
 point they stop receiving driver updates (not even bug fixes).

 You can see the problem with the integrated graphics in Intel's Sandy
 Bridge chips (introduced Q1 2011). You cannot get any OpenGL drivers dated
 newer than April 2014 - and some OpenGL features we require simply aren't
 working with this driver release.

 And so we have to tell some of our customers that their laptops must use
 the Ivy Bridge or later microarchitecture - or be equipped with a separate
 dedicated graphics chip.

 Christian


 2015-04-17 15:23 GMT+02:00 Andreas Schreiber a...@online.de:

 Hi Émeric,

 thx for the response and the idea ;)

 My driver version is:
 Intel(R) HD Graphics 4000

 Driver Version: 10.18.10.3958
 Operating System: Windows* 8.1 (6.3.9600)
 Installed DirectX* Version: 11.2
 Supported DirectX* Version: 11.0
 Shader Version: 5.0
 OpenGL* Version: 4.0
 OpenCL* Version: 1.2
 Physical Memory: 8036 MB
 Processor: Intel(R) Core(TM) i5-3320M CPU @ 2.60GHz
 Processor Speed: 2594 MHz
 Vendor ID: 8086
 Device ID: 0166
 Device Revision: 09

 * Processor Graphics Information *
 Processor Graphics in Use: Intel(R) HD Graphics 4000
 Video BIOS: 0.151
 Current Resolution: 1600 x 900


 I could not update it yet, but when I do and it helps, I will post it ;)

 ...

 Thank you!

 Cheers,
 Andreas

 --
 Read this topic online here:
 http://forum.openscenegraph.org/viewtopic.php?p=63453#63453









-- 
trajce nikolov nick


Re: [osg-users] StatsHandler missing numbers

2015-04-17 Thread Andreas Schreiber
Hi,

thx for the hint, sometimes they help ;). On my laptop it did not work; I tried
every threading model I could find in the viewer. No changes.

But I realized that this error in fullscreen affects just the lower 2/3 of the
screen. In the top 1/3 the objects are fine.

I tried to update my driver, but the Intel driver update tool said there are no
new ones - just one for my WLAN... telling me to download it and then saying
oh, you've got 64 bit, sry, it works only on 32 -.-...


But thx for all the answers you posted ;)
... 

Thank you!

Cheers,
Andreas

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=63464#63464




