On 10.12.2009, at 23:15, Henri Verbeet wrote:

> 2009/12/10 Stefan Dösinger <ste...@codeweavers.com>:
>> @@ -6304,7 +6304,7 @@ HRESULT create_primary_opengl_context(IWineD3DDevice 
>> *iface, IWineD3DSwapChain *
> ...
>> -    swapchain->context[0]->render_offscreen = swapchain->render_to_fbo;
>> +    swapchain->context[0]->render_offscreen = surface_is_offscreen(target);
> This will introduce a warning, surface_is_offscreen() takes an
> IWineD3DSurface pointer.
Whoops, I guess I shouldn't use my own gcc all the time; it chokes on the OS X 
headers and writes warning spam :-/
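
Presumably an explicit cast keeps it quiet, something like this (just a sketch 
of the adjusted line, assuming target is an IWineD3DSurfaceImpl * as in the 
rest of that function, so the cast is the only change):

    swapchain->context[0]->render_offscreen =
            surface_is_offscreen((IWineD3DSurface *)target);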

>   - Setting "render_offscreen" is really the responsibility of the
> context_create() call, similar to create_primary_opengl_context().
context_create() doesn't take a swapchain as a parameter, only a window. That is 
correct, because GL contexts are linked to the drawable, not to the swapchain. 
Thus context_create() doesn't have the information needed to set the offscreen 
flag. (I could just pass in a TRUE or FALSE parameter, but there's something 
else.)

The reason this line is needed, however, is that the context keeps a last 
render target, and if the app never changes the render target the sort-of-cached 
render_offscreen value never gets set properly. Maybe we should make sure that 
the full FindContext() path runs at least once for a newly created context; then 
we can drop this line and the one in create_primary_opengl_context(). Would 
keeping last_rt set to NULL after context creation work correctly?
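
Roughly what I have in mind, as a sketch only (names simplified, not the actual 
wined3d structures or prototypes):

    BOOL surface_is_offscreen(IWineD3DSurface *surface);  /* existing helper */

    struct sketch_context
    {
        IWineD3DSurface *last_rt;   /* render target this context was last prepared for */
        BOOL render_offscreen;      /* cached flag, only valid while last_rt is unchanged */
    };

    /* If a freshly created context starts with last_rt == NULL, the first call
     * cannot take the "nothing changed" fast path and has to recompute the
     * cached flag, so the explicit assignment at creation time becomes
     * unnecessary. */
    static void sketch_find_context(struct sketch_context *ctx, IWineD3DSurface *target)
    {
        if (ctx->last_rt == target) return;                     /* fast path */
        ctx->render_offscreen = surface_is_offscreen(target);   /* full setup path */
        ctx->last_rt = target;
    }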

As a side note, maybe create_primary_opengl_context is badly named. It's not 
part of the context management; it's a device helper function that happens to 
use the context management to set up a GL context, among other things. Maybe 
device_restart_opengl or something similar would be better.

>  - The line is actually wrong, the relevant context has been setup to
> render to the front buffer, so "render_offscreen" should always be
> FALSE, even if there is a back buffer.
Well, this code is in the back buffer path. render_offscreen is FALSE if we 
don't set it, because the context is allocated with HEAP_ZERO_MEMORY. The point 
of the line is to set it to TRUE if we render to an FBO. We never set a 
swapchain up to render to the front buffer at creation time if there is a back 
buffer.
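
For reference, this is roughly why the default is FALSE (simplified sketch, not 
the exact allocation code; the context type is abbreviated here):

    struct sketch_context
    {
        BOOL render_offscreen;
        /* ... the rest of the context state ... */
    };

    static struct sketch_context *sketch_alloc_context(IWineD3DSurfaceImpl *target)
    {
        struct sketch_context *ctx = HeapAlloc(GetProcessHeap(),
                HEAP_ZERO_MEMORY, sizeof(*ctx));

        if (!ctx) return NULL;
        /* All fields are zeroed, so render_offscreen starts out FALSE; the
         * patched line only has to flip it to TRUE when rendering to an FBO. */
        ctx->render_offscreen = surface_is_offscreen((IWineD3DSurface *)target);
        return ctx;
    }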

> 
>  - It doesn't make sense for "render_to_fbo" to be TRUE for a
> swapchain without back buffer.
Correct, but this needs another patch to fix.

First of all, render_to_fbo shouldn't matter when we're dealing with the front 
buffer, even when a back buffer is there. surface_is_offscreen takes care of 
that, but we have to use it. This is important when we're forcing render_to_fbo 
on as a driver bug workaround, or when we have a double-buffered setup with 
render_to_fbo on for a real (non-driver) reason and access the front buffer due 
to some ddraw/d3d9ex tricks or wined3d-internal stuff.
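
To spell out what I mean by "takes care of that", here is a rough sketch of the 
intended behaviour (not necessarily the exact implementation; field names are 
simplified):

    static BOOL sketch_surface_is_offscreen(IWineD3DSurfaceImpl *surface,
            IWineD3DSwapChainImpl *swapchain)
    {
        /* A surface that isn't part of a swapchain is always offscreen. */
        if (!swapchain) return TRUE;
        /* The front buffer is never offscreen, whatever render_to_fbo says. */
        if (surface == (IWineD3DSurfaceImpl *)swapchain->frontBuffer) return FALSE;
        /* Back buffers are offscreen exactly when the swapchain renders to an FBO. */
        return swapchain->render_to_fbo;
    }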

In the ddraw windowed mode rendering situation, rendering the back buffer to an 
FBO is actually the correct thing to do, because from the application's point 
of view the back buffer is an offscreen plain surface. We only construct a 
double-buffered GL setup to be able to do windowed ddraw rendering without 
FBOs on older drivers. Currently this triggers a few other bugs (e.g. in GetDC() 
+ Client_storage) that I want to fix first.

So the next step after fixing the GL errors and the GetDC bug is to ignore the 
back buffer size in CreateSwapchain and Reset if there is no back buffer. Then 
render_to_fbo will be FALSE for single-buffered setups.
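
Something along these lines is what I have in mind for that check (illustrative 
only; the helper name is made up, and the real decision also has to honour the 
driver workarounds mentioned above):

    static BOOL sketch_needs_render_to_fbo(const WINED3DPRESENT_PARAMETERS *pp,
            UINT window_width, UINT window_height)
    {
        /* Without a back buffer a size mismatch is irrelevant, so don't force
         * render_to_fbo on for single-buffered setups. */
        if (!pp->BackBufferCount) return FALSE;
        return pp->BackBufferWidth != window_width
                || pp->BackBufferHeight != window_height;
    }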

The academic question after that is whether we render windowed d3d7 setups to an 
FBO when one is available or not. If we have FBOs, the difference is small. I 
haven't seen an app that depends on this yet, and unless we are on MacOS 10.5 or 
earlier, which cannot write to the front buffer, I don't expect driver issues 
either. This whole thing is done to please drivers without FBOs.

So once the other bugs are fixed I'll add a check to CreateSwapchain and Reset 
to ignore the back buffer size if backbuffercount == 0.



