Re: Wine, fullscreen applications, and RandR 1.2

2012-09-08 Thread Andy Ritger
On Sat, Sep 08, 2012 at 12:08:06AM -0700, Henri Verbeet wrote:
> On 8 September 2012 01:22, Andy Ritger  wrote:
> > In any case, there is an enthusiast community around immersive gaming; e.g.,
> >
> > http://www.nvidia.com/object/3d-vision-surround-technology.html
> >
> > In those cases, the content across the monitors is rendered as one
> > big screen, rather than rendered separately per display with different
> > view frustums.
> >
> > With RandR 1.1 + NVIDIA MetaModes, users could achieve that sort of
> > configuration with Wine.  They can't really do that now with Wine and
> > RandR 1.2.
> >
> How does that work in Windows? In a typical D3D9 application you would
> normally end up with one adapter per display. Unless the application
> is multihead aware, switching to fullscreen mode would just make the
> application fullscreen on the primary display. Does the display driver
> change how the displays are presented to the application?

Yes.  As I understand it: the NVIDIA Windows driver, to support those
sorts of immersive gaming configurations, presents one adapter to the
Windows OS, which in turn presents that one adapter to the application
(e.g., one 7680x1600 adapter, rather than three 2560x1600 adapters).
This is conceptually similar to the NVIDIA Linux driver's MetaModes +
RandR 1.1 (or XF86VidMode).

There are definitely cases where you would want the application to
be multihead aware so that it can make more intelligent use of each
monitor.  But in the case of monitors with similar geometry, abstracting
the complexity of multiple monitors away from the application has had
some success.

Somewhat timely, a related article was just posted on Phoronix:

http://www.phoronix.com/scan.php?page=news_item&px=MTE3OTQ

That article suggests that there should be some coordination between
fullscreen applications and window managers.  Since that would also
introduce complexity, and would be needed across multiple applications
(and multiple window managers), I suspect one or more helper libraries
to centralize and abstract the complexity might be a good approach.

Thanks for the feedback in this thread.  I'll give some thought to what
these sorts of helper libraries might look like.
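
As a very rough strawman of the shape such an API might take (all names
here are invented for illustration; nothing like this exists today):

/* Hypothetical viewport-configuration helper; not an existing library. */
#include <X11/Xlib.h>

typedef struct vpconf_request
{
    unsigned int width, height;    /* desired desktop size (ViewPortIn) */
    unsigned int refresh;          /* desired refresh rate in Hz; 0 = any */
    const char  *output_name;      /* specific RandR output, or NULL */
    int          span_outputs;     /* non-zero: span all outputs as one */
} vpconf_request;

/* Configure CRTCs/outputs (mode, transform, Border) to satisfy 'req';
 * returns 0 on success. */
int vpconf_apply(Display *dpy, int screen, const vpconf_request *req);

/* Restore whatever configuration was in place before vpconf_apply(). */
int vpconf_restore(Display *dpy, int screen);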

Thanks,
- Andy






Re: Wine, fullscreen applications, and RandR 1.2

2012-09-08 Thread Henri Verbeet
On 8 September 2012 01:22, Andy Ritger  wrote:
> In any case, there is an enthusiast community around immersive gaming; e.g.,
>
> http://www.nvidia.com/object/3d-vision-surround-technology.html
>
> In those cases, the content across the monitors is rendered as one
> big screen, rather than rendered separately per display with different
> view frustums.
>
> With RandR 1.1 + NVIDIA MetaModes, users could achieve that sort of
> configuration with Wine.  They can't really do that now with Wine and
> RandR 1.2.
>
How does that work in Windows? In a typical D3D9 application you would
normally end up with one adapter per display. Unless the application
is multihead aware, switching to fullscreen mode would just make the
application fullscreen on the primary display. Does the display driver
change how the displays are presented to the application?




Re: Wine, fullscreen applications, and RandR 1.2

2012-09-07 Thread Andy Ritger
On Thu, Sep 06, 2012 at 01:05:42AM -0700, Henri Verbeet wrote:
> On 5 September 2012 22:45, Andy Ritger  wrote:
> > On Wed, Sep 05, 2012 at 11:26:23AM -0700, Henri Verbeet wrote:
> >> From Wine's point of view, we'd just get a bunch of extra code to maintain
> >> because nvidia does things differently from everyone else.
> >
> > Eventually, I hope NVIDIA isn't unique about this approach to viewport
> > configuration.  The drawbacks aren't NVIDIA-specific.
> >
> > The concern about added code maintenance to Wine is fair; is that concern
> > lessened if the details of viewport configuration are abstracted by a new
> > standard library?
> >
> A little, but adding extra dependencies isn't without its costs
> either. You'd also have to wait a fair bit before it ends up in common
> distributions. (And that means things like "Debian stable" or "RHEL",
> not "Ubuntu & Fedora".)

Yes; understood.

> >> Perhaps there's a use case for a "big screen" setup, but that too is
> >> something that's probably best handled on the RandR / X server level
> >> instead of Wine.
> >
> > I don't think we can have it both ways:
> >
> > * RandR 1.1 gives you one "big screen" per X screen; the user can
> >   configure what is within that big screen via NVIDIA's MetaModes.
> >
> > * RandR 1.2 gives applications control of each individual CRTC/output.
> >
> > Are you suggesting we go back to something more like RandR 1.1?
> >
> I imagine you could configure things in e.g. xorg.conf so that the
> displays show up as a single CRTC / output instead of individual
> CRTCs, perhaps depending on how the MetaModes are set up. Or perhaps
> there's some way to make this fit in with some of the new
> functionality in RandR 1.5. The main point though is that there's no
> reasonable way for Wine to decide to do one or the other
> automatically. We could of course add a configuration option for that,
> but at that point it makes more sense to do it globally for all
> applications in RandR or the X server.

RandR 1.2+ intentionally gives control of the individual CRTCs/outputs
to client applications, with the goal of moving display configuration
policy out of the X server.  But I agree it is a burden for every
fullscreen application to have to decide for itself how to configure
the available CRTCs/outputs.

Here again maybe some helper library can help bridge the gap.

> >> I don't think you can actually do "immersive gaming"
> >> properly without support from the application though, you'll get
> >> fairly significant distortion at the edges if you just render to such
> >> a setup as if it was a single very wide display.
> >
> > I'm sorry; I don't understand the distortion concern.  Are you referring
> > to the bezel of the monitor occupying physical space, but not pixel space
> > in the X screen?  I believe people often address that by configuring
> > "dead space" in the X screen between their monitors.
> >
> No, I mean the distortion caused by perspective projection, because
> you're essentially projecting a spherical "world" onto a flat screen.
> An application that is aware of this could e.g. use a separate camera
> for each display to mitigate this, or perhaps adjust the projection
> matrix.

Maybe there is a distortion concern in theory (though it may depend on
whether the monitors are all in the same plane or tilted to "wrap around"
the user), but in practice it seems to normally work out well enough.
I suppose users use the side displays for peripheral vision, and then
move their viewport such that the content they care about is on the
center display.

In any case, there is an enthusiast community around immersive gaming; e.g.,

http://www.nvidia.com/object/3d-vision-surround-technology.html

In those cases, the content across the monitors is rendered as one
big screen, rather than rendered separately per display with different
view frustums.

With RandR 1.1 + NVIDIA MetaModes, users could achieve that sort of
configuration with Wine.  They can't really do that now with Wine and
RandR 1.2.

> > Since we have some differing viewpoints that won't be quickly resolved,
> > how about as a compromise we add a way for users to force Wine from
> > RandR 1.2 back to RandR 1.1?  That would at least let users achieve
> > some of the configurations they cannot configure with top of tree.
> > If that seems fair, what is the preferred configuration mechanism to
> > do that?  Just a simple environment variable?
> >
> You could perhaps extend the existing "UseXRandR" registry setting
> (see dlls/winex11.drv/x11drv_main.c) to take a version number. It's
> not something we've heard from a lot of users about, though.

Sounds good; I'll look into that.
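
As a very rough sketch of the parsing side (not actual Wine code; the
version-pinning semantics and the helper name are invented for
illustration):

#include <string.h>

/* Hypothetical: interpret an extended "UseXRandR" registry value.
 * "N"/"Y" keep the current boolean meaning; "1.1" or "1.2" would pin
 * the RandR version winex11.drv uses.  Returns non-zero if RandR should
 * be used at all; *force_version is 11, 12, or 0 (no preference). */
static int parse_use_xrandr( const char *value, int *force_version )
{
    *force_version = 0;
    if (!strcmp( value, "1.1" )) { *force_version = 11; return 1; }
    if (!strcmp( value, "1.2" )) { *force_version = 12; return 1; }
    return (value[0] == 'Y' || value[0] == 'y' || value[0] == '1');
}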

Thanks,
- Andy






Re: Wine, fullscreen applications, and RandR 1.2

2012-09-06 Thread Henri Verbeet
On 5 September 2012 22:45, Andy Ritger  wrote:
> On Wed, Sep 05, 2012 at 11:26:23AM -0700, Henri Verbeet wrote:
>> From Wine's point of view, we'd just get a bunch of extra code to maintain
>> because nvidia does things differently from everyone else.
>
> Eventually, I hope NVIDIA isn't unique about this approach to viewport
> configuration.  The drawbacks aren't NVIDIA-specific.
>
> The concern about added code maintenance to Wine is fair; is that concern
> lessened if the details of viewport configuration are abstracted by a new
> standard library?
>
A little, but adding extra dependencies isn't without its costs
either. You'd also have to wait a fair bit before it ends up in common
distributions. (And that means things like "Debian stable" or "RHEL",
not "Ubuntu & Fedora".)

>> Perhaps there's a use case for a "big screen" setup, but that too is
>> something that's probably best handled on the RandR / X server level
>> instead of Wine.
>
> I don't think we can have it both ways:
>
> * RandR 1.1 gives you one "big screen" per X screen; the user can
>   configure what is within that big screen via NVIDIA's MetaModes.
>
> * RandR 1.2 gives applications control of each individual CRTC/output.
>
> Are you suggesting we go back to something more like RandR 1.1?
>
I imagine you could configure things in e.g. xorg.conf so that the
displays show up as a single CRTC / output instead of individual
CRTCs, perhaps depending on how the MetaModes are set up. Or perhaps
there's some way to make this fit in with some of the new
functionality in RandR 1.5. The main point though is that there's no
reasonable way for Wine to decide to do one or the other
automatically. We could of course add a configuration option for that,
but at that point it makes more sense to do it globally for all
applications in RandR or the X server.

>> I don't think you can actually do "immersive gaming"
>> properly without support from the application though, you'll get
>> fairly significant distortion at the edges if you just render to such
>> a setup as if it was a single very wide display.
>
> I'm sorry; I don't understand the distortion concern.  Are you referring
> to the bezel of the monitor occupying physical space, but not pixel space
> in the X screen?  I believe people often address that by configuring
> "dead space" in the X screen between their monitors.
>
No, I mean the distortion caused by perspective projection, because
you're essentially projecting a spherical "world" onto a flat screen.
An application that is aware of this could e.g. use a separate camera
for each display to mitigate this, or perhaps adjust the projection
matrix.

> Since we have some differing viewpoints that won't be quickly resolved,
> how about as a compromise we add a way for users to force Wine from
> RandR 1.2 back to RandR 1.1?  That would at least let users achieve
> some of the configurations they cannot configure with top of tree.
> If that seems fair, what is the preferred configuration mechanism to
> do that?  Just a simple environment variable?
>
You could perhaps extend the existing "UseXRandR" registry setting
(see dlls/winex11.drv/x11drv_main.c) to take a version number. It's
not something we've heard from a lot of users about, though.




Re: Wine, fullscreen applications, and RandR 1.2

2012-09-05 Thread Andy Ritger
On Wed, Sep 05, 2012 at 11:26:23AM -0700, Henri Verbeet wrote:
> On 5 September 2012 18:52, Andy Ritger  wrote:
> > At first glance, I agree that would be easier for applications, but that
> > approach has some drawbacks:
> >
> > * lies to the user/application about what timings are actually being
> >   driven to the monitor
> > * the above bullet causes confusion: the timings reported in the monitor's
> >   on screen display don't match what is reported by the X server
> > * user/application doesn't get complete control over what actual timings
> >   are being sent to the monitor
> > * does not provide the full flexibility of the hardware to configure,
> >   e.g., arbitrary position of the ViewPortOut within the active raster
> >
> Perhaps, but none of that changes, as far as Win32 applications are
> concerned, if we generate modes in Wine instead of in the kernel.

Agreed.

> From Wine's point of view, we'd just get a bunch of extra code to maintain
> because nvidia does things differently from everyone else.

Eventually, I hope NVIDIA isn't unique about this approach to viewport
configuration.  The drawbacks aren't NVIDIA-specific.

The concern about added code maintenance to Wine is fair; is that concern
lessened if the details of viewport configuration are abstracted by a new
standard library?

> > I imagine counter arguments include:
> >
> > * we already have the "scaling mode" output property in most drivers;
> >   that is good enough
> > * Transformation matrix and Border are too low level for most applications
> >
> > For the first counter argument: I'm trying to make the case that
> > providing the full flexibility, and being truthful about modetimings to
> > users/applications, is valuable enough to merit a change (hopefully even
> > in the drivers that currently expose a "scaling mode" output property).
> >
> I must say that I'm having some trouble imagining what not generating
> standard modes will allow someone to do that they couldn't do before.
> In terms of figuring out the "real" timings, the RandR "preferred"
> mode is probably close enough, but I suppose it should be fairly easy
> to extend RandR to explicitly mark specific modes as "native". I
> imagine that for most applications it's just an implementation detail
> whether the display panel has a scaler itself, or if that's done by
> the GPU though. Either way, that seems like a discussion more
> appropriate for e.g. dri-devel.

Fair enough; I'll discuss with the other drivers, first.

> Perhaps there's a use case for a "big screen" setup, but that too is
> something that's probably best handled on the RandR / X server level
> instead of Wine.

I don't think we can have it both ways:

* RandR 1.1 gives you one "big screen" per X screen; the user can
  configure what is within that big screen via NVIDIA's MetaModes.

* RandR 1.2 gives applications control of each individual CRTC/output.

Are you suggesting we go back to something more like RandR 1.1?

> I don't think you can actually do "immersive gaming"
> properly without support from the application though, you'll get
> fairly significant distortion at the edges if you just render to such
> a setup as if it was a single very wide display.

I'm sorry; I don't understand the distortion concern.  Are you referring
to the bezel of the monitor occupying physical space, but not pixel space
in the X screen?  I believe people often address that by configuring
"dead space" in the X screen between their monitors.

In any case, my impression is that multi-monitor fullscreen gaming is
not an uncommon use case.

> (Also, uneven numbers
> of displays are probably more useful for such a thing than even
> numbers of displays.)

Agreed.


Since we have some differing viewpoints that won't be quickly resolved,
how about as a compromise we add a way for users to force Wine from
RandR 1.2 back to RandR 1.1?  That would at least let users achieve
some of the configurations they cannot configure with top of tree.
If that seems fair, what is the preferred configuration mechanism to
do that?  Just a simple environment variable?

Thanks,
- Andy





Re: Wine, fullscreen applications, and RandR 1.2

2012-09-05 Thread Henri Verbeet
On 5 September 2012 18:52, Andy Ritger  wrote:
> At first glance, I agree that would be easier for applications, but that
> approach has some drawbacks:
>
> * lies to the user/application about what timings are actually being
>   driven to the monitor
> * the above bullet causes confusion: the timings reported in the monitor's
>   on screen display don't match what is reported by the X server
> * user/application doesn't get complete control over what actual timings
>   are being sent to the monitor
> * does not provide the full flexibility of the hardware to configure,
>   e.g., arbitrary position of the ViewPortOut within the active raster
>
Perhaps, but none of that changes, as far as Win32 applications are
concerned, if we generate modes in Wine instead of in the kernel. From
Wine's point of view, we'd just get a bunch of extra code to maintain
because nvidia does things differently from everyone else.

> I imagine counter arguments include:
>
> * we already have the "scaling mode" output property in most drivers;
>   that is good enough
> * Transformation matrix and Border are too low level for most applications
>
> For the first counter argument: I'm trying to make the case that
> providing the full flexibility, and being truthful about modetimings to
> users/applications, is valuable enough to merit a change (hopefully even
> in the drivers that currently expose a "scaling mode" output property).
>
I must say that I'm having some trouble imagining what not generating
standard modes will allow someone to do that they couldn't do before.
In terms of figuring out the "real" timings, the RandR "preferred"
mode is probably close enough, but I suppose it should be fairly easy
to extend RandR to explicitly mark specific modes as "native". I
imagine that for most applications it's just an implementation detail
whether the display panel has a scaler itself, or if that's done by
the GPU though. Either way, that seems like a discussion more
appropriate for e.g. dri-devel.

> When the RandR primary output (as queried/set by RR[SG]etOutputPrimary)
> is non-None, then its CRTC will be sorted to the front of the CRTCs
> list reported by RRGetScreenResources{,Current}.  However, None is a
> valid value for the primary output, in which case all bets are off wrt
> CRTC/output sorting order in the RRGetScreenResources{,Current} reply.
>
Yes, as I said this is something we'll probably address at some point.

> Further, while RandR primary output seems like a reasonable default,
> the spec spells out a focus on window manager (e.g., "primary" is where
> the menu bar should be placed).  It seems like a valid use case would
> be for the user to have his window manager primary output on one monitor,
> but run his full screen Wine application on another monitor.  Given that,
> would it be reasonable for the user to specify the RandR output he wants
> Wine to use?
>
We can probably add an override if there's a lot of demand. It doesn't
strike me as a very common use case though.

> I can definitely believe that plumbing RandR outputs to multiple objects
> in Win32 is not an important/compelling use case, since not many Win32
> applications would do useful things with that.  What seems more useful,
> though, is driving multiple RandR outputs and presenting that to Win32 as
> a single big screen.  E.g., "immersive gaming" where your Wine application
> spans two, three, or more RandR outputs (NVIDIA Kepler GPUs can have up
> to four heads).
>
Perhaps there's a use case for a "big screen" setup, but that too is
something that's probably best handled on the RandR / X server level
instead of Wine. I don't think you can actually do "immersive gaming"
properly without support from the application though, you'll get
fairly significant distortion at the edges if you just render to such
a setup as if it was a single very wide display. (Also, uneven numbers
of displays are probably more useful for such a thing than even
numbers of displays.)




Re: Wine, fullscreen applications, and RandR 1.2

2012-09-05 Thread Andy Ritger

Thanks, Henri.

On Wed, Sep 05, 2012 at 01:34:47AM -0700, Henri Verbeet wrote:
> On 5 September 2012 08:07, Andy Ritger  wrote:
> > Questions:
> >
> > * Looking at dlls/winex11.drv/xrandr.c, the first RandR CRTC/output's
> >   modelist is used to populate Wine's list of available modes.  Is the
> >   data flow between Wine and Windows applications always such that you
> >   need to advertise a list of (width, height, refreshRate)s?  Or would
> >   an application ever tell Wine what resolution it wants?
> >
> Windows applications use EnumDisplaySettingsEx() to query supported
> modes, and ChangeDisplaySettingsEx() to set one. Applications can't
> make up modes on their own.

Thanks for clarifying.

> > * Would you be open to patches to make dlls/winex11.drv/xrandr.c generate
> >   a larger set of (width, height, refreshRate)s, and then have
> >   xrandr12_set_current_mode() use RandR transformation matrix and Border
> >   property to satisfy those?  I was envisioning something where we take
> >   the "preferred" mode for the RandR output, and create all of the
> >   following resolutions using ViewPort{In,Out}:
> >
> > 1920 x 1200
> > 1920 x 1080
> > 1600 x 1200
> > 1280 x 1024
> > 1280 x 720
> > 1024 x 768
> > 800 x 600
> > 640 x 480
> >
> It's ultimately not up to me whether such a patch would be accepted,
> but it's not something I would be particularly happy about. I think
> the preferred way to handle this would be to generate the standard DMT
> etc. modes in the kernel, and use the "scaling mode" output property
> to control the scaling mode, pretty much like all the other drivers.

At first glance, I agree that would be easier for applications, but that
approach has some drawbacks:

* lies to the user/application about what timings are actually being
  driven to the monitor
* the above bullet causes confusion: the timings reported in the monitor's
  on screen display don't match what is reported by the X server
* user/application doesn't get complete control over what actual timings
  are being sent to the monitor
* does not provide the full flexibility of the hardware to configure,
  e.g., arbitrary position of the ViewPortOut within the active raster

I imagine counter arguments include:

* we already have the "scaling mode" output property in most drivers;
  that is good enough
* Transformation matrix and Border are too low level for most applications

For the first counter argument: I'm trying to make the case that
providing the full flexibility, and being truthful about modetimings to
users/applications, is valuable enough to merit a change (hopefully even
in the drivers that currently expose a "scaling mode" output property).

For the second counter argument: maybe the viewport configuration
belongs in a library rather than directly in applications like Wine.
In that case, I'd like to build a better understanding of Wine's needs
so that I could properly design the API for such a library.

> > * The current xrandr.c code picks the first CRTC/output, which may not
> >   be currently active.  At the least, it should scan for an active
> >   CRTC+output.  I imagine it would be even better if the user could
> >   configure which RandR output they want.  Would that be reasonable?  What
> >   mechanisms are available in Wine for users to provide runtime
> >   configuration?
> >
> The RandR primary display should be CRTC 0, output 0.

That is true most of the time, but I don't believe it is strictly mandated
by the RandR specification:

http://cgit.freedesktop.org/xorg/proto/randrproto/tree/randrproto.txt

When the RandR primary output (as queried/set by RR[SG]etOutputPrimary)
is non-None, then its CRTC will be sorted to the front of the CRTCs
list reported by RRGetScreenResources{,Current}.  However, None is a
valid value for the primary output, in which case all bets are off wrt
CRTC/output sorting order in the RRGetScreenResources{,Current} reply.

Further, while RandR primary output seems like a reasonable default,
the spec spells out a focus on the window manager (e.g., "primary" is where
the menu bar should be placed).  It seems like a valid use case would
be for the user to have his window manager primary output on one monitor,
but run his full screen Wine application on another monitor.  Given that,
would it be reasonable for the user to specify the RandR output he wants
Wine to use?
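
For example, a CRTC-selection helper along these lines (just a sketch,
not Wine code; the function name and fallback policy are invented) would
prefer the primary output but still behave sensibly when none is set:

#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

/* Prefer the CRTC of the RandR primary output when one is set; otherwise
 * fall back to the first connected output that is driving a CRTC. */
static RRCrtc pick_crtc( Display *dpy, Window root )
{
    XRRScreenResources *res = XRRGetScreenResourcesCurrent( dpy, root );
    RROutput primary = XRRGetOutputPrimary( dpy, root );
    RRCrtc crtc = 0;
    int i;

    if (!res) return 0;
    for (i = 0; i < res->noutput; i++)
    {
        XRROutputInfo *info = XRRGetOutputInfo( dpy, res, res->outputs[i] );
        if (!info) continue;
        if (info->crtc && info->connection == RR_Connected)
        {
            if (res->outputs[i] == primary)
            {
                crtc = info->crtc;         /* best match: the primary output */
                XRRFreeOutputInfo( info );
                break;
            }
            if (!crtc) crtc = info->crtc;  /* fallback: first active output */
        }
        XRRFreeOutputInfo( info );
    }
    XRRFreeScreenResources( res );
    return crtc;
}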

> Users can
> typically change this through xrandr or xorg.conf. Unfortunately not
> all drivers do something reasonable by default here, so we'll probably
> add code to pick the first connected display as Win32 primary instead
> if no primary is defined through RandR. For the moment we end up
> falling back to the older RandR version though, so at least the
> behaviour isn't any worse than before.
>
> > * From the current code, it does not look like Wine's RandR support tries
> >   to do anything with multiple simultaneous RandR outputs.
> >
> Yes, proper multihead support is something we still need to implement.

Re: Wine, fullscreen applications, and RandR 1.2

2012-09-05 Thread Henri Verbeet
On 5 September 2012 08:07, Andy Ritger  wrote:
> Questions:
>
> * Looking at dlls/winex11.drv/xrandr.c, the first RandR CRTC/output's
>   modelist is used to populate Wine's list of available modes.  Is the
>   data flow between Wine and Windows applications always such that you
>   need to advertise a list of (width, height, refreshRate)s?  Or would
>   an application ever tell Wine what resolution it wants?
>
Windows applications use EnumDisplaySettingsEx() to query supported
modes, and ChangeDisplaySettingsEx() to set one. Applications can't
make up modes on their own.
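
For reference, a minimal illustration of that Win32 flow (the 1280x720
request is arbitrary, and real code would handle the DISP_CHANGE_*
results more carefully):

#include <windows.h>
#include <string.h>
#include <stdio.h>

int main(void)
{
    DEVMODEA dm;
    DWORD i;

    /* Enumerate the primary adapter's advertised mode list. */
    memset(&dm, 0, sizeof(dm));
    dm.dmSize = sizeof(dm);
    for (i = 0; EnumDisplaySettingsExA(NULL, i, &dm, 0); i++)
        printf("%ux%u @ %u Hz\n", (unsigned)dm.dmPelsWidth,
               (unsigned)dm.dmPelsHeight, (unsigned)dm.dmDisplayFrequency);

    /* Request 1280x720; this only succeeds if the driver advertised it. */
    memset(&dm, 0, sizeof(dm));
    dm.dmSize = sizeof(dm);
    dm.dmPelsWidth = 1280;
    dm.dmPelsHeight = 720;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT;
    if (ChangeDisplaySettingsExA(NULL, &dm, NULL, CDS_FULLSCREEN, NULL)
            != DISP_CHANGE_SUCCESSFUL)
        fprintf(stderr, "1280x720 not available\n");
    return 0;
}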

> * Would you be open to patches to make dlls/winex11.drv/xrandr.c generate
>   a larger set of (width, height, refreshRate)s, and then have
>   xrandr12_set_current_mode() use RandR transformation matrix and Border
>   property to satisfy those?  I was envisioning something where we take
>   the "preferred" mode for the RandR output, and create all of the
>   following resolutions using ViewPort{In,Out}:
>
> 1920 x 1200
> 1920 x 1080
> 1600 x 1200
> 1280 x 1024
> 1280 x 720
> 1024 x 768
> 800 x 600
> 640 x 480
>
It's ultimately not up to me whether such a patch would be accepted,
but it's not something I would be particularly happy about. I think
the preferred way to handle this would be to generate the standard DMT
etc. modes in the kernel, and use the "scaling mode" output property
to control the scaling mode, pretty much like all the other drivers.

> * The current xrandr.c code picks the first CRTC/output, which may not
>   be currently active.  At the least, it should scan for an active
>   CRTC+output.  I imagine it would be even better if the user could
>   configure which RandR output they want.  Would that be reasonable?  What
>   mechanisms are available in Wine for users to provide runtime configuration?
>
The RandR primary display should be CRTC 0, output 0. Users can
typically change this through xrandr or xorg.conf. Unfortunately not
all drivers do something reasonable by default here, so we'll probably
add code to pick the first connected display as Win32 primary instead
if no primary is defined through RandR. For the moment we end up
falling back to the older RandR version though, so at least the
behaviour isn't any worse than before.

> * From the current code, it does not look like Wine's RandR support tries
>   to do anything with multiple simultaneous RandR outputs.
>
Yes, proper multihead support is something we still need to implement.
There aren't a lot of Win32 applications that do something useful with
multiple displays though, so it's not something that has a very high
priority at the moment.

>   Ironically:
>   this actually works better with RandR 1.1 + NVIDIA: users can configure
>   their MetaModes to describe what mode (plus viewport configuration)
>   they want on each monitor, and then RandR 1.1 chooses the MetaMode.
>
No. With RandR 1.1 you get one big screen, and you can choose between
getting fullscreen applications stretched across all your displays, or
turning off all displays except one. What you actually want is for the
application to be fullscreen on a specific display, or multiple
displays if the application supports that, and leave everything else
alone.

As an aside, the fake refresh rates generated by "DynamicTwinView"
aren't very helpful either. Some Win32 applications expect modes like
800x600@60Hz or 1024x768@60Hz to always exist, and just die if they
don't.




Wine, fullscreen applications, and RandR 1.2

2012-09-04 Thread Andy Ritger

Hello wine developers,

I work on NVIDIA's Linux graphics driver team, and have a few questions
about how Wine should interact with X driver mode lists.  Sorry if this
isn't the correct forum for these questions.

Starting in release 302.xx, we finally added RandR 1.2 support to NVIDIA's
X driver.  At the same time, we reworked some things about how modetimings
are validated and configured by the NVIDIA X driver.  For the following,
the important part is that we eliminated implicit flat panel scaling,
and instead made it explicitly configurable through NVIDIA's MetaMode
syntax and through RandR 1.2.  The user-visible change is that only
modes reported by a digital flat panel's EDID are in the mode list for
an RandR output.

A little background: modern GPUs (at least NVIDIA, and I expect other
vendors) have a flexible display scaling pipeline that consists of:

* A "RasterSize": this is the resolution of pixels that will be sent
  to the monitor; this is what people normally think of as the size
  of the "mode".

* A "ViewPortIn": this is the resolution of pixels that the display
  engine will fetch from the X screen.

* A "ViewPortOut": this is the region _within_ the RasterSize to
  which the pixels of ViewPortIn should be sent.  The pixels fetched
  in ViewPortIn can be scaled up or down by specifying different
  sizes for ViewPortIn and ViewPortOut.  Also, letterboxing and
  overscan compensation can be configured by making ViewPortOut
  smaller than RasterSize.

For example, if your monitor accepts a mode of 1920x1200, and you want
a desktop resolution of 1280x720 aspect-scaled to fill 1920x1200 (i.e.,
1280x720 scaled to 1920x1080, with 60 blank scanlines above and below
it in a 1920x1200 mode), then in NVIDIA MetaMode syntax you could do:

"1920x1200 { ViewPortIn = 1280x720, ViewPortOut = 1920x1080+0+60 }"

While the MetaMode syntax is NVIDIA-specific, the same can be configured
through RandR:

* The RandR transformation matrix can be used to describe the scaling
  between ViewPortOut and ViewPortIn.

* The RandR "Border" output property can be used, when available,
  to describe a ViewPortOut that is smaller than the RasterSize.
  Note: most RandR X drivers, including NVIDIA, don't yet provide
  this property, though I'm in the process of adding it for NVIDIA.

In practice this means that, from the perspective of desktop size, most
resolutions (i.e., not just a fixed modelist) are achievable with RandR.
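
For illustration, the scaling half of the MetaMode example above could be
requested through the RandR client library roughly like this (a sketch
only: the already-chosen crtc, the 1920x1080 mode, and the filter choice
are assumptions, and the letterboxing half would additionally need the
Border property described above):

#include <string.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>
#include <X11/extensions/Xrender.h>   /* XTransform, XDoubleToFixed */

/* Ask RandR to scale a 1280x720 ViewPortIn up to a CRTC driving a
 * 1920x1080 mode.  CRTC/output selection, the mode set itself, and
 * error handling are omitted. */
static void request_viewport_in( Display *dpy, RRCrtc crtc )
{
    XTransform xform;

    memset( &xform, 0, sizeof(xform) );
    xform.matrix[0][0] = XDoubleToFixed( 1280.0 / 1920.0 );  /* x scale */
    xform.matrix[1][1] = XDoubleToFixed(  720.0 / 1080.0 );  /* y scale */
    xform.matrix[2][2] = XDoubleToFixed( 1.0 );

    /* The transform is pending; it takes effect when a mode is next set
     * on this CRTC (e.g. via XRRSetCrtcConfig). */
    XRRSetCrtcTransform( dpy, crtc, &xform, "bilinear", NULL, 0 );
}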

Questions:

* Looking at dlls/winex11.drv/xrandr.c, the first RandR CRTC/output's
  modelist is used to populate Wine's list of available modes.  Is the
  data flow between Wine and Windows applications always such that you
  need to advertise a list of (width, height, refreshRate)s?  Or would
  an application ever tell Wine what resolution it wants?

* Would you be open to patches to make dlls/winex11.drv/xrandr.c generate
  a larger set of (width, height, refreshRate)s, and then have
  xrandr12_set_current_mode() use RandR transformation matrix and Border
  property to satisfy those?  I was envisioning something where we take
  the "preferred" mode for the RandR output, and create all of the
  following resolutions using ViewPort{In,Out}:

1920 x 1200
1920 x 1080
1600 x 1200
1280 x 1024
1280 x 720
1024 x 768
800 x 600
640 x 480

* The current xrandr.c code picks the first CRTC/output, which may not
  be currently active.  At the least, it should scan for an active
  CRTC+output.  I imagine it would be even better if the user could
  configure which RandR output they want.  Would that be reasonable?  What
  mechanisms are available in Wine for users to provide runtime configuration?

* From the current code, it does not look like Wine's RandR support tries
  to do anything with multiple simultaneous RandR outputs.  Ironically:
  this actually works better with RandR 1.1 + NVIDIA: users can configure
  their MetaModes to describe what mode (plus viewport configuration)
  they want on each monitor, and then RandR 1.1 chooses the MetaMode.

  Would it make sense for the user to be able to specify to Wine the
  RandR configuration (spanning all of the RandR outputs on the X screen)?
  I guess that depends on what runtime configuration mechanisms are
  possible.

I'm curious what you guys think, before I code up any patches to propose.

Thanks,
- Andy Ritger