Re: How do we want to deal with 4k tiled displays?

2014-06-11 Thread Dave Airlie
On 17 January 2014 05:11, Aaron Plattner  wrote:
> [...]

Wow, it's a bit messy no matter what we do here.

I am coming around to the view that hiding this makes life harder for
things that might want to know about it, e.g. color management and
other EDID users.

But I do think we need some sort of RandR improvement for output
grouping to handle this.

From the open source driver POV, I added a tile property to the KMS
outputs that essentially carries the tile information from the
DisplayID protocol.

I then export this via RandR as a tile property as well. I've hacked
up rrxinerama.c to take note of these tile properties, and it at least
combines things into one. So far I'm trying to limit the ABI breakage
for myself, so I've only hacked internally into rrxinerama.c:

http://people.freedesktop.org/~airlied/scratch/0001-rrxinerama-hack-in-tiling-support.patch
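
A minimal sketch of what consuming such a tile property could look
like, assuming a colon-separated descriptor along the lines of what
later shipped as the kernel's TILE connector property; the exact field
list and names here are assumptions, not necessarily what the patch
above uses:

#include <stdio.h>

struct tile_info {
    int group_id;           /* tile group this connector belongs to */
    int is_single_monitor;  /* all tiles form one physical monitor */
    int num_h_tiles;        /* tile grid width */
    int num_v_tiles;        /* tile grid height */
    int h_loc, v_loc;       /* this tile's position in the grid */
    int h_size, v_size;     /* this tile's resolution in pixels */
};

/* Parse "group:single:num_h:num_v:h_loc:v_loc:h_size:v_size". */
static int parse_tile_property(const char *prop, struct tile_info *t)
{
    int n = sscanf(prop, "%d:%d:%d:%d:%d:%d:%d:%d",
                   &t->group_id, &t->is_single_monitor,
                   &t->num_h_tiles, &t->num_v_tiles,
                   &t->h_loc, &t->v_loc, &t->h_size, &t->v_size);
    return n == 8 ? 0 : -1;
}

rrxinerama.c can then report one logical screen of
(num_h_tiles * h_size) x (num_v_tiles * v_size) per group id instead
of one screen per output.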

I suspect I really want OutputGroup. Any idea what an ideal
OutputGroup should contain?

Dave.

Re: How do we want to deal with 4k tiled displays?

2014-02-27 Thread Dave Airlie
On Tue, Jan 28, 2014 at 1:52 PM, Keith Packard  wrote:
> "Pierre-Loup A. Griffais"  writes:
>
>> It seems the X server should be responsible for both making sure the two
>> halves are laid out properly (right now they're in reverse order) and
>> hidden from clients. This database of quirks would ideally live in the
>> server itself where it can be easily leveraged by everyone shipping
>> DDXen.
>
> Yeah, at some level, it would have been good to separate the device
> configuration from the logical partition of the whole X screen into
> 'monitors'. One nice thing that would have easily allowed was the
> creation of virtual 'monitors' for vnc or even DisplayLink. As it is,
> we've managed (once again) to link a particular hardware interface to an
> application capability. Sigh.
>
> We do have such an abstraction today, called Xinerama, but it seemed
> like the right plan was to deprecate that and encourage applications to
> use RandR directly.

I'm also not sure how much of this the X server should be doing; we
are moving away from X servers.

I'd really like fbcon and Wayland to work on 4k monitors, so at least
from my POV we should be hiding as much of this as possible in the
kernel drivers for KMS users.

Dave.


Re: How do we want to deal with 4k tiled displays?

2014-01-27 Thread Keith Packard
"Pierre-Loup A. Griffais"  writes:

> It seems the X server should be responsible both for making sure the
> two halves are laid out properly (right now they're in reverse order)
> and for hiding them from clients. This database of quirks would
> ideally live in the server itself where it can be easily leveraged by
> everyone shipping DDXen.

Yeah, at some level, it would have been good to separate the device
configuration from the logical partition of the whole X screen into
'monitors'. One nice thing that would have easily allowed was the
creation of virtual 'monitors' for vnc or even DisplayLink. As it is,
we've managed (once again) to link a particular hardware interface to an
application capability. Sigh.

We do have such an abstraction today, called Xinerama, but it seemed
like the right plan was to deprecate that and encourage applications to
use RandR directly.

Hindsight.

-- 
keith.pack...@intel.com



Re: How do we want to deal with 4k tiled displays?

2014-01-27 Thread Pierre-Loup A. Griffais
I see lots of discussion has happened on that front already; sorry for
being late to the party.

I'm currently wrestling with a Dell UP3214Q, which seems to exhibit
all of the problems you refer to. There are many existing binary games
out there, most likely never to be touched again, that are RandR
clients and try either to maximize themselves to one of the displays
or to ask the window manager to do it for them after discovering the
monitor topology. Some of these games ship their own libXrandr, some
statically link against it, and some inline the protocol code to pick
up bug fixes that weren't widespread in distros at the time they were
packaged. It's pretty bad. Given that, the only viable solution seems
to lie completely on the X server side of things.


It seems the X server should be responsible both for making sure the
two halves are laid out properly (right now they're in reverse order)
and for hiding them from clients. This database of quirks would ideally
live in the server itself where it can be easily leveraged by everyone
shipping DDXen.


Thanks,
 - Pierre-Loup

On 01/16/2014 11:11 AM, Aaron Plattner wrote:

> [...]




Re: How do we want to deal with 4k tiled displays?

2014-01-23 Thread Michal Suchanek
On 23 January 2014 05:24, Alexander E. Patrakov  wrote:
> 2014/1/23 Keith Packard :
>> Andy Ritger  writes:
>>

>>> * How should hotplug of the monitor's second tile be handled by the
>>>   server if it is hiding the two tiles?  Should such a hotplug generate
>>>   a connected event to RandR clients?  Maybe a hotplug on either tile
>>>   gets reported as a connect event on the one API-facing output.
>>
>> Presumably you'd only want to report 'connected' if both wires were
>> hooked up? Otherwise, the monitor isn't really useful.
>
> In the pathological situation when only one wire is hooked up, I guess
> that the monitor is still usable at its original size and aspect
> ratio, just not at full resolution. Someone has to verify this. If
> this is so, then "unplug one 1920x1080 monitor, plug two mirrored 4K
> monitors" sounds more appropriate. Or, if the fact that the other half
> exists is detectable from EDID, we can indeed not support single-wire
> operation.
>

Why is this pathological?

Does the screen fail when only one input is connected, or when the two
inputs are connected to different machines?

I don't have a tiled monitor, but every monitor with more than one
input that I've ever had, I ended up using with more than one computer
at some point, because I tend to have more computers than monitors
around.

Thanks

Michal


Re: How do we want to deal with 4k tiled displays?

2014-01-23 Thread Alexander E. Patrakov
2014/1/23 Alexander E. Patrakov :
> Keith Packard wrote:

>> and it may well confuse applications into thinking that they can
>> configure the two "monitors" separately.

...

> Another class of clients that attempt to reconfigure screens is fullscreen
> games, and here breakage is indeed not allowed at all. But we can try running
> them with two mirrored 1920x1080 monitors right now, attempt to select a lower
> resolution in the game (or just run a game that insists on the lower
> resolution) and see what breaks and what is not fixable manually by running
> xrandr to configure the second monitor identically to the first one. I will
> test this later today and report.

OK, I tried this under Xfce at home with two games, and it failed in
an unexpected way.

Don't Starve (SDL2-based): by default, it runs fine in the cloned
mode. However, if I attempt to change the resolution, it just does
nothing and returns to 1920x1080.

Baldur's Gate (under Wine): sets the eDP1 output to 640x480, and thus
appears in the corner of HDMI1. If I change the HDMI1 resolution to
640x480 behind its back, too, then it is cloned correctly and works.

So, although the games clearly behave in a non-optimal way, it is not
something that would break them completely in the proposed "fake
mirror" setup. This does not invalidate my previous remark that the
fake mirror setup may be pointless if Aaron's worries are also already
true on ivybridge.

OTOH, in non-clone mode, Don't Starve always appears on the leftmost
monitor (and the setting to choose the display doesn't work). Baldur's
Gate always appears on eDP1. So in "real" multimonitor configurations
both games are broken, but this is off-topic for this thread.

-- 
Alexander E. Patrakov


Re: How do we want to deal with 4k tiled displays?

2014-01-22 Thread Alexander E. Patrakov
Keith Packard wrote:
> "Alexander E. Patrakov"  writes:
> > What's wrong with my proposal to report "mirrored screens" to clients
> > even though the outputs are not really mirrors? In this case, each
> > mirror can get one EDID, implementing option (1).
> 
> We'd have to report both at the full resolution so that applications
> would maximize correctly; I don't see how this really helps here, other
> than potentially confusing applications. It leaves the EDID problem
> unsolved (you don't have an EDID which reflects the unified display),
> and it may well confuse applications into thinking that they can
> configure the two "monitors" separately.

As for the "you don't have an EDID which reflects the unified display", that's 
absolutely correct. But my opinion is that it doesn't really exist (thus there 
is no problem to solve), so I'd like to see some arguments that show that it 
is needed.

Now, about the objection concerning independent configuration of the
"mirrors". First of all, it looks valid and is testable. But, as a
user, I would expect (and tolerate) the existing screen configuration
tools to break in such a tiled 4K setup - after all, this setup cannot
be fully expressed in the language they understand. I would also
welcome attempts to limit such breakage to the necessary minimum.

We could probably deal with that by mirroring the configuration
changes done on one mirror to the other inside the X server, just as
if another client had made the matching change. After all, this
(another tool reconfiguring outputs) can happen anyway with today's
hardware and software, and configuration tools are already supposed to
be able to deal with it. I don't have two monitors here at work, but
at least KDE's kcm_randr correctly updates itself if I change the
resolution using xrandr behind its back.
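
A hypothetical sketch of that server-side mirroring, using the X
server's internal RRCrtcSet() entry point. The sibling lookup is
invented for illustration, and a real version would also substitute
the sibling tile's own output list rather than reusing the caller's:

#include "randrstr.h"

/* Stub: a real version would find the other tile's CRTC via the
 * tile/OutputGroup bookkeeping. */
static RRCrtcPtr rr_crtc_sibling(RRCrtcPtr crtc)
{
    return NULL;
}

/* Called after a client reconfigures one tile: replay the matching
 * change on the other tile, as if a second client had requested it. */
static Bool
mirror_crtc_config(RRCrtcPtr crtc, RRModePtr mode, int x, int y,
                   Rotation rotation, int numOutputs, RROutputPtr *outputs)
{
    RRCrtcPtr sibling = rr_crtc_sibling(crtc);

    if (!sibling)
        return TRUE;  /* not part of a tiled pair: nothing to mirror */

    /* Identical x/y gives a true mirror; offsetting by the tile
     * position would give the unified-desktop layout instead. */
    return RRCrtcSet(sibling, mode, x, y, rotation, numOutputs, outputs);
}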

Another class of clients that attempt to reconfigure screens is
fullscreen games, and here breakage is indeed not allowed at all. But
we can try running them with two mirrored 1920x1080 monitors right
now, attempt to select a lower resolution in the game (or just run a
game that insists on the lower resolution) and see what breaks and
what is not fixable manually by running xrandr to configure the second
monitor identically to the first one. I will test this later today and
report.

Given the above and the following words from Aaron Plattner (unfortunately, 
not testable, unlike the above), I think that further discussion is needed.

Aaron Plattner wrote:

> If we present them as a single output to applications, they'll make
> the assumption that they can just assign a single crtc to that output and
> use the remaining crtcs for something else.  I suspect that deleting crtcs
> or otherwise marking them as used as a side effect of setting a mode on a
> different crtc is going to explode a lot of existing applications.

OTOH, as you wrote earlier, "RandR is always allowed to say 'no' to
any particular configuration". So do I understand correctly that any
client that breaks is already broken, e.g. with ivybridge? If this is
so, then indeed, the fake mirror that I proposed is somewhat
pointless.

Further in your mail, you wrote:

> Oh, I can imagine advertising the dual-wire setup as a separate output?
> Would that be helpful at all?
> ...
> We used to just have N outputs to 1 CRTC; it looks like we've got
> N outputs to M CRTCs...

That looks more like a question addressed to Aaron Plattner.

-- 
Alexander E. Patrakov


Re: How do we want to deal with 4k tiled displays?

2014-01-22 Thread Keith Packard
"Alexander E. Patrakov"  writes:

> What would be the case where the driver would want to do things
> differently?

It's always nice to provide the option?

>>  1) Report both EDIDs, presumably using some new convention
>>  2) Construct a fake unified EDID
>>  3) Don't report EDID at all
>>
>> Obviously 3) is the easiest :-)
>
> (3) breaks color managers, because they rely upon the EDID to
> associate color profiles with outputs.

Yeah, it's sub-optimal. 

> What's wrong with my proposal to report "mirrored screens" to clients
> even though the outputs are not really mirrors? In this case, each
> mirror can get one EDID, implementing option (1).

We'd have to report both at the full resolution so that applications
would maximize correctly; I don't see how this really helps here, other
than potentially confusing applications. It leaves the EDID problem
unsolved (you don't have an EDID which reflects the unified display),
and it may well confuse applications into thinking that they can
configure the two "monitors" separately.

> In the pathological situation when only one wire is hooked up, I guess
> that the monitor is still usable at its original size and aspect
> ratio, just not at full resolution. Someone has to verify this. If
> this is so, then "unplug one 1920x1080 monitor, plug two mirrored 4K
> monitors" sounds more appropriate. Or, if the fact that the other half
> exists is detectable from EDID, we can indeed not support single-wire
> operation.

I'd love to know if this works, in which case we really can mostly
ignore the problem. Of course, as you plug in the cables, you're going
to see a bunch of mode flipping.

Oh, I can imagine advertising the dual-wire setup as a separate output?
Would that be helpful at all?

> +1, but I repeat that mirrors would work here, too, and nicely account
> for busy CRTCs.

Yeah, that's a benefit. I still think the potential for confusing other
applications is higher with mirroring though.

> +1 to "separate problem". Indeed, there is one problem of making
> existing applications work without changing the protocol, and one
> problem of extending the protocol so that interested clients can learn
> about combined displays or make power walls on the fly.

We used to just have N outputs to 1 CRTC; it looks like we've got
N outputs to M CRTCs...

-- 
keith.pack...@intel.com



Re: How do we want to deal with 4k tiled displays?

2014-01-22 Thread Alexander E. Patrakov
2014/1/23 Keith Packard :
> Andy Ritger  writes:
>
>> Keith, did you mean to say "driver" or "X server"?
>
> Well, I meant to say 'driver', but I can see reasons for wanting to at
> least have some code in hw/xfree86/modes that could help out. In any
> case, definitely within the X server, but beyond that I'd say we should
> make as much code common as possible.
>
>> The case of connectors sharing DP lanes seems like a hardware-specific
>> constraint best arbitrated within the hardware's driver.  But these tiled
>> monitors requiring multiple CRTCs+outputs doesn't seem hardware-specific,
>> so arguably doesn't belong in a hardware-specific driver.  At the least,
>> it would be unfortunate if each driver chose to solve this configuration
>> differently.
>
> Right, which is where a helper function might be a better solution, in
> case the driver did want to do things differently.

What would be the case where the driver would want to do things differently?

>> But, even if we hide the two tiles within the X server, rather than
>> within drivers, there would be behavioral quirks.  E.g.,
>>
>> * The EDID of each tile indicates the modetimings for that tile (as well
>>   as the physical position of the tile within the whole monitor).  When we
>>   provide the EDID to RandR clients through the EDID output property,
>>   which tile's EDID should we provide?  Or should we construct a fake
>>   EDID that describes the combined resolution?  Maybe in practice no
>>   RandR clients care about this information.
>
> Interesting. Sounds like we have three choices:
>
>  1) Report both EDIDs, presumably using some new convention
>  2) Construct a fake unified EDID
>  3) Don't report EDID at all
>
> Obviously 3) is the easiest :-)

(3) breaks color managers, because they rely upon the EDID to
associate color profiles with outputs.

What's wrong with my proposal to report "mirrored screens" to clients
even though the outputs are not really mirrors? In this case, each
mirror can get one EDID, implementing option (1).

>> * How should hotplug of the monitor's second tile be handled by the
>>   server if it is hiding the two tiles?  Should such a hotplug generate
>>   a connected event to RandR clients?  Maybe a hotplug on either tile
>>   gets reported as a connect event on the one API-facing output.
>
> Presumably you'd only want to report 'connected' if both wires were
> hooked up? Otherwise, the monitor isn't really useful.

In the pathological situation when only one wire is hooked up, I guess
that the monitor is still usable at its original size and aspect
ratio, just not at full resolution. Someone has to verify this. If
this is so, then "unplug one 1920x1080 monitor, plug two mirrored 4K
monitors" sounds more appropriate. Or, if the fact that the other half
exists is detectable from EDID, we can indeed not support single-wire
operation.

>
>> Also, there are a variety of other scenarios where one large virtual
>> monitor has multiple tiles of input: powerwalls, multi-projector
>> setups, etc.  In these cases, users already complain about windows
>> getting maximized to only one output, or fullscreen applications only
>> setting a mode on one of the outputs.
>
> Which is why we must synthesize a single output.

+1, but I repeat that mirrors would work here, too, and nicely account
for busy CRTCs.

>> Those installations are admittedly niche and generally have savvy
>> administrators who can beat a configuration into submission, while the
>> tiled 4k monitors are coming to the average user.  Still, it seems like
>> both tiled 4k monitors and powerwalls present the same general problem,
>> so it would be nice if we can solve them with one general solution.
>>
>> I think I lean slightly towards trying to handle this client-side.
>
> I don't see how this will work as we have multiple RandR bindings now,
> and one (XCB) is explicitly very low-level. We'd have to interpose a new
> library into the system and convert all applications to using that. I
> think it'd be a whole lot easier to do this in the X server.
>
>> It seems like that information could be conveyed through Aaron's
>> OutputGroup idea.  Maybe that is too much detail for clients, but
>> maybe we could have a client-side library (either added to libXrandr
>> or something new) that can abstract the details from clients who
>> prefer that to having full flexibility themselves.
>
> Much as core clients can't see multiple RandR outputs today, and instead
> use the screen geometry directly, I think we have to make existing RandR
> aware applications "work" reasonably with the current protocol, which
> means synthesizing a large output out of multiple smaller outputs. If
> you want to *also* extend the RandR protocol so that smarter clients can
> drill through that large output and see the individual monitors, that
> sounds like a separable problem.

+1 to "separate problem". Indeed, there is one problem of making
existing applications work without changing t

Re: How do we want to deal with 4k tiled displays?

2014-01-22 Thread Keith Packard
Andy Ritger  writes:

> Keith, did you mean to say "driver" or "X server"?

Well, I meant to say 'driver', but I can see reasons for wanting to at
least have some code in hw/xfree86/modes that could help out. In any
case, definitely within the X server, but beyond that I'd say we should
make as much code common as possible.

> The case of connectors sharing DP lanes seems like a hardware-specific
> constraint best arbitrated within the hardware's driver.  But these tiled
> monitors requiring multiple CRTCs+outputs doesn't seem hardware-specific,
> so arguably doesn't belong in a hardware-specific driver.  At the least,
> it would be unfortunate if each driver chose to solve this configuration
> differently.

Right, which is where a helper function might be a better solution, in
case the driver did want to do things differently.

> But, even if we hide the two tiles within the X server, rather than
> within drivers, there would be behavioral quirks.  E.g.,
>
> * The EDID of each tile indicates the modetimings for that tile (as well
>   as the physical position of the tile within the whole monitor).  When we
>   provide the EDID to RandR clients through the EDID output property,
>   which tile's EDID should we provide?  Or should we construct a fake
>   EDID that describes the combined resolution?  Maybe in practice no
>   RandR clients care about this information.

Interesting. Sounds like we have three choices:

 1) Report both EDIDs, presumably using some new convention
 2) Construct a fake unified EDID
 3) Don't report EDID at all

Obviously 3) is the easiest :-)
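
For what 2) would involve, one detail is mechanical: after splicing a
combined-resolution detailed timing into a copy of one tile's EDID,
the block checksum has to be recomputed, because every 128-byte EDID
block must sum to 0 mod 256. A minimal sketch (the timing splice
itself is omitted):

#include <stdint.h>
#include <stddef.h>

#define EDID_BLOCK_SIZE 128

/* Recompute byte 127 so the whole block sums to 0 (mod 256). */
static void edid_fix_checksum(uint8_t block[EDID_BLOCK_SIZE])
{
    uint8_t sum = 0;
    size_t i;

    for (i = 0; i < EDID_BLOCK_SIZE - 1; i++)
        sum += block[i];
    block[EDID_BLOCK_SIZE - 1] = (uint8_t)(0x100 - sum);
}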

> * How should hotplug of the monitor's second tile be handled by the
>   server if it is hiding the two tiles?  Should such a hotplug generate
>   a connected event to RandR clients?  Maybe a hotplug on either tile
>   gets reported as a connect event on the one API-facing output.

Presumably you'd only want to report 'connected' if both wires were
hooked up? Otherwise, the monitor isn't really useful.
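
A sketch of that policy on the server side; the tile bookkeeping here
is invented for illustration, and only RROutputSetConnection() is a
real server call:

#include "randrstr.h"

struct tile_group {
    int num_tiles;      /* tiles the DisplayID data says exist */
    int tiles_present;  /* tiles whose wires are currently hooked up */
};

/* Re-evaluate the synthesized output on every tile hotplug: only
 * claim 'connected' once the whole monitor is usable. */
static void
update_logical_connection(RROutputPtr logical, struct tile_group *g)
{
    RROutputSetConnection(logical,
                          g->tiles_present == g->num_tiles
                          ? RR_Connected : RR_Disconnected);
}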

> Also, there are a variety of other scenarios where one large virtual
> monitor has multiple tiles of input: powerwalls, multi-projector
> setups, etc.  In these cases, users already complain about windows
> getting maximized to only one output, or fullscreen applications only
> setting a mode on one of the outputs.

Which is why we must synthesize a single output.

> Those installations are admittedly niche and generally have savvy
> administrators who can beat a configuration into submission, while the
> tiled 4k monitors are coming to the average user.  Still, it seems like
> both tiled 4k monitors and powerwalls present the same general problem,
> so it would be nice if we can solve them with one general solution.
>
> I think I lean slightly towards trying to handle this client-side.

I don't see how this will work as we have multiple RandR bindings now,
and one (XCB) is explicitly very low-level. We'd have to interpose a new
library into the system and convert all applications to using that. I
think it'd be a whole lot easier to do this in the X server.

> It seems like that information could be conveyed through Aaron's
> OutputGroup idea.  Maybe that is too much detail for clients, but
> maybe we could have a client-side library (either added to libXrandr
> or something new) that can abstract the details from clients who
> prefer that to having full flexibility themselves.

Much as core clients can't see multiple RandR outputs today, and instead
use the screen geometry directly, I think we have to make existing RandR
aware applications "work" reasonably with the current protocol, which
means synthesizing a large output out of multiple smaller outputs. If
you want to *also* extend the RandR protocol so that smarter clients can
drill through that large output and see the individual monitors, that
sounds like a separable problem.

> Granted, clients would probably hate this idea...

And, if clients hate it, they'll drag their feet and you won't get
anything usable until the 2-wire monitors go away. It's not a terrible
plan at that :-)

-- 
keith.pack...@intel.com



Re: How do we want to deal with 4k tiled displays?

2014-01-22 Thread Andy Ritger
On Sun, Jan 19, 2014 at 01:26:07PM -0800, Keith Packard wrote:
> 
> Aaron Plattner  writes:
> 
> >  1. Hide the presence of the second tile in the X server.
> >
> > Somehow combine the two tiles into a single logical output at the RandR
> > protocol level.  The X server would be responsible for setting up the
> > right configuration to drive the logical output using the correct
> > physical resources.
> 
> This is effectively what we do with the wacky ivybridge 3-output
> setup. With that chipset, there are 4 DP lanes available and two
> connectors. If you suck up all 4 lanes for the first connector, then you
> have none left for the second one. If you've already configured the
> second one, then you can't use the higher resolution modes on the first
> one. The only way to tell is to try and see what RandR says.
> 
> RandR is always allowed to say 'no' to any particular configuration, and
> in the tiled case, I'd suggest that the correct approach would be to
> have the driver pretend that the monitor is connected to only one of the
> outputs, and that configuring the 4k mode would require that sufficient
> resources be available to drive both physical links.

Keith, did you mean to say "driver" or "X server"?

The case of connectors sharing DP lanes seems like a hardware-specific
constraint best arbitrated within the hardware's driver.  But these tiled
monitors requiring multiple CRTCs+outputs doesn't seem hardware-specific,
so arguably doesn't belong in a hardware-specific driver.  At the least,
it would be unfortunate if each driver chose to solve this configuration
differently.

But, even if we hide the two tiles within the X server, rather than
within drivers, there would be behavioral quirks.  E.g.,

* The EDID of each tile indicates the modetimings for that tile (as well
  as the physical position of the tile within the whole monitor).  When we
  provide the EDID to RandR clients through the EDID output property,
  which tile's EDID should we provide?  Or should we construct a fake
  EDID that describes the combined resolution?  Maybe in practice no
  RandR clients care about this information.

* How should hotplug of the monitor's second tile be handled by the
  server if it is hiding the two tiles?  Should such a hotplug generate
  a connected event to RandR clients?  Maybe a hotplug on either tile
  gets reported as a connect event on the one API-facing output.

Also, there are a variety of other scenarios where one large virtual
monitor has multiple tiles of input: powerwalls, multi-projector
setups, etc.  In these cases, users already complain about windows
getting maximized to only one output, or fullscreen applications only
setting a mode on one of the outputs.

Those installations are admittedly niche and generally have savvy
administrators who can beat a configuration into submission, while the
tiled 4k monitors are coming to the average user.  Still, it seems like
both tiled 4k monitors and powerwalls present the same general problem,
so it would be nice if we can solve them with one general solution.

I think I lean slightly towards trying to handle this client-side.

If the goals are:

* Configuration utilities know which outputs are part of the same monitor 
  (and the physical orientation), to make intelligent decisions about 
  output layout.

* Window managers know which outputs are part of the same monitor, 
  to make intelligent maximize and snap-to-edge behavior.

* Fullscreen applications know which outputs are part of the same monitor, 
  to make intelligent modesetting decisions.

It seems like that information could be conveyed through Aaron's
OutputGroup idea.  Maybe that is too much detail for clients, but
maybe we could have a client-side library (either added to libXrandr
or something new) that can abstract the details from clients who
prefer that to having full flexibility themselves.

Granted, clients would probably hate this idea...

- Andy


> -- 
> keith.pack...@intel.com
> 


Re: How do we want to deal with 4k tiled displays?

2014-01-19 Thread Keith Packard
Aaron Plattner  writes:

>  1. Hide the presence of the second tile in the X server.
>
> Somehow combine the two tiles into a single logical output at the RandR
> protocol level.  The X server would be responsible for setting up the
> right configuration to drive the logical output using the correct
> physical resources.

This is effectively what we do with the wacky ivybridge 3-output
setup. With that chipset, there are 4 DP lanes available and two
connectors. If you suck up all 4 lanes for the first connector, then you
have none left for the second one. If you've already configured the
second one, then you can't use the higher resolution modes on the first
one. The only way to tell is to try and see what RandR says.

RandR is always allowed to say 'no' to any particular configuration, and
in the tiled case, I'd suggest that the correct approach would be to
have the driver pretend that the monitor is connected to only one of the
outputs, and that configuring the 4k mode would require that sufficient
resources be available to drive both physical links.

-- 
keith.pack...@intel.com



Re: How do we want to deal with 4k tiled displays?

2014-01-17 Thread Jasper St. Pierre
Is there a reason we can't somehow "merge" the two CRTCs into one
virtual CRTC? Too difficult to modeset?



On Thu, Jan 16, 2014 at 2:11 PM, Aaron Plattner wrote:

> [...]




-- 
  Jasper

Re: How do we want to deal with 4k tiled displays?

2014-01-17 Thread Alexander E. Patrakov
Aaron Plattner wrote:
> So, monitor manufacturers are starting to make high-resolution displays that
> consist of one LCD panel that appears to the PC as two.  The one I've got
> is a Dell UP2414Q.  It shows up to the PC as two DisplayPort 1.2
> multistream devices that have the same GUID but different EDIDs.  There's
> an extension block in the EDID that's supposed to indicate which side is
> the left tile and which is the right, though I haven't tried to decode it
> yet.


Thanks for disclosing the exact model information.


> [...]

Disclaimer: I am just a user, not a developer of anything related to graphics.

Based on the above, I think that the following summarizes the requirements:

1. Games must think that there is a single 4k screen they can maximize to. Key 
idea: mirrors would work, too!

2. Information that two CRTCs are consumed should also be available to
the clients.

3. There must be a way to configure several existing frameless displays as a 
single power wall, i.e. erase boundaries between them.

4. The X server must erase boundaries automatically for devices that only 
pretend to consist of two or more parts.

So my proposal would be to add a new request that marks some existing
outputs as a power wall. As a result of this request, the list of
modes they support should be replaced with one single fixed mode -
that of the resulting power wall - and boundaries should be erased.
For compatibility with existing applications, all these outputs should
pretend that they are mirrors of each other (and indeed it currently
makes sense to run a game on mirrored displays). Relative positions of
outputs inside the power wall should not be available through the
RandR 1.2 protocol. Attempts to set them to anything other than
mirrors should fail.

So here is some mockup of the resulting xrandr output:

Screen 0: minimum 3840x2160, current 3840x2160, maximum 8192 x 8192
DP1 connected 3840x2160+0+0 (normal left inverted right x axis y axis) 509mm x 286mm
   3840x2160 60.0*+
DP2 connected 3840x2160+0+0 (normal left inverted right x axis y axis) 509mm x 286mm
   3840x2160 60.0*+

even though each half is in fact 1920x2160.
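
To make the shape of the proposed request concrete, here is an
entirely hypothetical wire encoding in the style of randrproto.h; no
such request exists, and the opcode and field layout are invented:

#include <X11/Xmd.h>

typedef struct {
    CARD8  reqType;       /* RandR extension major opcode */
    CARD8  randrReqType;  /* hypothetical X_RRSetPowerWall minor opcode */
    CARD16 length;        /* request length in 4-byte units */
    CARD16 nOutputs;      /* number of RROutput XIDs that follow */
    CARD16 pad;
    /* followed by nOutputs CARD32 output XIDs; the server replaces
     * their mode lists with the single combined mode and reports
     * them as mirrors of each other */
} xRRSetPowerWallReq;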

-- 
Alexander E. Patrakov


How do we want to deal with 4k tiled displays?

2014-01-16 Thread Aaron Plattner
So, monitor manufacturers are starting to make high-resolution displays that
consist of one LCD panel that appears to the PC as two.  The one I've got is a
Dell UP2414Q.  It shows up to the PC as two DisplayPort 1.2 multistream devices
that have the same GUID but different EDIDs.  There's an extension block in the
EDID that's supposed to indicate which side is the left tile and which is the
right, though I haven't tried to decode it yet.

The problem, obviously, is that applications (including some games) treat the
two tiles as if they were completely separate monitors.  Windows maximize to
only half of the screen.  My question is, how do we want to deal with these
monitors?

As far as I see it, we have four options:

 1. Hide the presence of the second tile in the X server.

Somehow combine the two tiles into a single logical output at the RandR
protocol level.  The X server would be responsible for setting up the right
configuration to drive the logical output using the correct physical
resources.

 2. Hide the presence of the second tile in libXrandr.

This would allow interested applications to query the real state of the
hardware while also making it easier to do modesets on a per-monitor level
rather than per-output.

This could be exposed either as a new "simple" modeset API in libXrandr or
similar, or by modifying the existing interface and having a new interface
to punch through the façade and get at the real configuration, for clients
that care.

 3. Update every application that uses RandR 1.2.

Applications can detect the presence of these monitors and deal with them
themselves, but this might have poor adoption because programmers are a lazy
bunch in general.

 4. Do nothing and hope the problem goes away.

Hopefully, the situation with current 4k monitors is temporary and we'll
start seeing single-tile 4k displays soon, fixing the problem "forever".
Until we get 8k tiled displays.

If the real output devices are still exposed through the protocol, it might make
sense to add new properties describing their relative positions to make it
easier for clients to lay them out in the right order.  This might be useful for
power-walls too.

The problem with the first two options is that driving these monitors consumes
two crtcs.  If we present them as a single output to applications, they'll make
the assumption that they can just assign a single crtc to that output and use
the remaining crtcs for something else.  I suspect that deleting crtcs or
otherwise marking them as used as a side effect of setting a mode on a different
crtc is going to explode a lot of existing applications.

~~

Regardless of what we do about the current crop of 4k monitors, one feature I
would like to add is a standardized OutputGroup property.  Multiple outputs with
the same value of OutputGroup should be considered (both by clients and the
server) as a single logical monitor.  This would affect the Xinerama information
presented by rrxinerama.c, and window managers that use RandR 1.2 directly would
be encouraged to consider output groups in their UI behavior.

The X server could configure OutputGroups automatically when setting up the
initial configuration based on the presence of tiled displays, and clients could
reconfigure the groups at runtime to get different behavior if desired.
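
A sketch of how a client might consume such a property, assuming the
server encodes OutputGroup as a single 32-bit integer; the property
name and encoding are the proposal above, not anything a server
implements today:

#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/Xatom.h>
#include <X11/extensions/Xrandr.h>

/* Read the (proposed) OutputGroup property of one output.
 * Returns the group id, or -1 if the output isn't grouped. */
static long output_group(Display *dpy, RROutput output, Atom prop)
{
    Atom actual_type;
    int actual_format;
    unsigned long nitems, bytes_after;
    unsigned char *data = NULL;
    long group = -1;

    if (XRRGetOutputProperty(dpy, output, prop, 0, 1, False, False,
                             XA_INTEGER, &actual_type, &actual_format,
                             &nitems, &bytes_after, &data) == Success &&
        actual_type == XA_INTEGER && actual_format == 32 && nitems == 1)
        group = *(long *) data;

    if (data)
        XFree(data);
    return group;
}

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    XRRScreenResources *res =
        XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    Atom prop = XInternAtom(dpy, "OutputGroup", True);

    /* Outputs sharing a group id form one logical monitor. */
    for (int i = 0; res && prop != None && i < res->noutput; i++)
        printf("output %lu -> group %ld\n",
               (unsigned long) res->outputs[i],
               output_group(dpy, res->outputs[i], prop));

    if (res)
        XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}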

Does this sound like a reasonable extension to RandR?

-- 
Aaron