Re: HDR support in Wayland/Weston

2019-03-12 Thread Graeme Gill
Michel Dänzer wrote:

> It was never reliable for that. Other clients using any of those
> mechanisms could always interfere, at least for the RandR compatibility
> output.

I disagree. It was reliable in the sense that running the
profile loader set it to a known state, irrespective
of whatever other applications may have done via
other APIs. With the behavior changed to combine
all the API settings, there is no simple way to
set it to a known state.

> To make all these mechanisms work reliably and consistently at all times.

Except it has the opposite effect in a color management sense !

> If you have specific suggestions, please post them to the xorg-devel
> mailing lists or create a merge request at
> https://gitlab.freedesktop.org/xorg/xserver/merge_requests .

Fair enough.

Cheers,
Graeme Gill.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/wayland-devel

Enabling HDR support in Wayland / Weston

2019-03-11 Thread Sharma, Shashank
Hello All,

We have raised a new merge request, for the HDR implementation in Weston:
https://gitlab.freedesktop.org/wayland/weston/merge_requests/124

This patch series is a continuation of the design we published here:
https://lists.freedesktop.org/archives/wayland-devel/2019-January/039808.html

We have added a basic, limited-focus implementation of an HDR video playback 
stack. Some of the testing specs are:

-HDR Format: P010

-Some of the videos tested: 
https://4kmedia.org/samsung-wonderland-two-hdr-uhd-4k-demo ; and others from 
the same site.

-Display / Overlays HW: Icelake

-Monitors: LG 32UD99-W / 27UK650-W
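As a rough illustration of the P010 format named above (not code from the 
patch series): P010 is laid out like NV12, except each sample occupies a 
16-bit word with the 10 significant bits at the high end. The helper below 
is hypothetical and assumes no extra row padding:

```python
# Hypothetical sketch of the P010 buffer layout: a full-resolution Y plane
# followed by a half-height interleaved Cb/Cr plane, 2 bytes per sample.

def p010_plane_layout(width, height):
    """Return offset/pitch/size for the Y and UV planes of a
    width x height P010 buffer (no row padding assumed)."""
    bytes_per_sample = 2                  # 10 bits stored in a 16-bit word
    y_pitch = width * bytes_per_sample    # one Y sample per pixel
    y_size = y_pitch * height
    # UV plane: interleaved Cb/Cr pairs, 2x2 subsampled like NV12, so the
    # pitch matches the Y plane: (width/2) pairs * 2 samples * 2 bytes.
    uv_pitch = y_pitch
    uv_size = uv_pitch * (height // 2)
    return {
        "Y":  {"offset": 0,      "pitch": y_pitch,  "size": y_size},
        "UV": {"offset": y_size, "pitch": uv_pitch, "size": uv_size},
    }

# A UHD frame like the test videos: total allocation is 3 bytes per pixel.
layout = p010_plane_layout(3840, 2160)
assert layout["UV"]["offset"] == 3840 * 2160 * 2
```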

This patch-set targets:
- HDR framework in drm-backend, which will allow a single plane HDR playback 
using Icelake HW overlays.
- HDR support in Weston's GL backend, which will allow GL-based rendering for 
HDR playback as a fallback method on other hardware.

The Wayland protocol for HDR/Color management is already being discussed at 
various forums and lists.
Please note that this patch-set contains a temporary implementation of the 
HDR metadata and colorspace protocol, tagged do-not-merge, included only to 
complete the testing stack. We are using it as a placeholder for the actual 
color management protocol.
https://lists.freedesktop.org/archives/wayland-devel/2019-March/040264.html
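As background for the HDR metadata being discussed (illustration only, not 
part of the merge request): HDR10-style metadata accompanies content encoded 
with the SMPTE ST 2084 (PQ) transfer function, whose EOTF maps the 
non-linear signal to absolute luminance. A sketch using the constants from 
ST 2084:

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10 content.
# Constants are the rational values given in the ST 2084 specification.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal):
    """Map a non-linear PQ signal in [0, 1] to luminance in cd/m^2."""
    e = signal ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

# Code value 0 is black; full code maps to the 10000 cd/m^2 PQ peak.
assert pq_eotf(0.0) == 0.0
assert abs(pq_eotf(1.0) - 10000.0) < 1e-6
```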

To give everyone some background on these parallel threads of work: in my 
team, we have divided the total HDR development work into 3 parts:
-  The HDR protocol development as a color management subset: (Driven by Ankit 
Nautiyal)
-  Adding HDR support in GL backend : (Driven by Harish Krupo)
-  Compositor and framework changes in DRM-backend : (Driven by me, Shashank 
Sharma)

Please provide feedback on the implementation.

Regards
Shashank


Re: HDR support in Wayland/Weston

2019-03-11 Thread Pekka Paalanen
On Sat, 9 Mar 2019 16:02:44 -0700
Chris Murphy  wrote:

> Call me crazy but I'd like to think GTK or Qt could provide, if they
> so chose, this abstraction such that a display profiling app could run
> on any platform with any compositor, and calibrate or profile. These
> higher level APIs should detect the compositor being used, and tell
> the compositor to do the right thing for the task at hand.

I cannot speak for them, but they certainly could if they can design an
API to expose. Not by detecting the compositor per se though, but
adapting to the Wayland and other interfaces they can discover.

I believe Qt may even have a DRM KMS backend, too, so you are already
able to run some Qt apps without any display server straight on KMS.

https://doc.qt.io/qt-5/embedded-linux.html mentions KMS and libinput
with eglfs.


Thanks,
pq



Re: HDR support in Wayland/Weston

2019-03-09 Thread Chris Murphy
On Fri, Mar 8, 2019 at 3:31 AM Pekka Paalanen  wrote:
>
> On Fri, 8 Mar 2019 08:35:20 +1100
> Graeme Gill  wrote:
>
> > Michel Dänzer wrote:
> >
> > > It sounds like KMS leases could be a pretty good fit for a calibration
> > > application. It can lease each output individually from the Wayland
> > > compositor and fully control it using KMS APIs, while the Wayland
> > > compositor continues running normally on other outputs.
> >
> > There seems to be this idea that has got a hold amongst many commentators
> > on this topic here, that somehow the display calibration and profiling
> > application NEEDs raw and direct access to a display to operate.
>
> Hi Graeme,
>
> that very idea stems from the early Wayland color management
> discussions where it was strictly demanded that the application must be
> able to completely bypass the compositor and own the hardware to do its
> job correctly, because there is no trusting that a compositor could get
> color management right. That is how it essentially was on X11 since the
> X server did not second-guess or mangle application commands or images
> and it did more or less expose the hardware of its time as is, letting
> all the applications fight among each other for control.
>
> I'm happy to see the original demand has been mostly dropped, but
> apparently it was so strongly underlined that it is hard to understand
> how much of it is actually necessary.

It is an anti-historical demand that the compositor must be bypassed
for calibration and characterization. Since the first such
applications appeared on System 7 with color QuickDraw, there was the
exact same compositor in place for the "display profiling application"
(the application that does calibration+characterization+verification
and builds an ICC profile from the characterization and then installs
and registers it with the OS) as any other application. This is the
same today on Windows, with the DWM compositor, and macOS with the
Quartz compositor, and they can't be disabled.

> DRM leases are the modern idea for completely bypassing the compositor /
> display server, and taking full control of the relevant part of the
> display hardware yourself. DRM leases are driven by virtual reality
> (VR) use cases for head-mounted displays (HMD), where the VR
> application (or a VR compositor that is separate from the desktop
> compositor due to hard realtime requirements) will be driving the HMD
> directly by kernel UAPI (KMS) and it makes no sense to share the HMD
> with anything else at the same time.
>
> If we look at APIs, DRM KMS API is universal on Linux. DRM KMS is more
> universal on Linux than Wayland, X11, or others, or any toolkit API,
> because DRM KMS is *the* way to control displays at the lowest level
> userspace can have access. KMS is hardware agnostic, but it does expose
> hardware-specific features through a common API.



> That said, personally I do think there is a good place for a Wayland
> protocol extension designed for color
> measurement/characterisation/calibration applications (is there a
> shorter term I could use for all those apps?):

display profiling app *shrug*


>
> - It keeps the Wayland compositor in the loop, meaning that you are
>   sure to reset the hardware state correctly for measurement, instead
>   of the measurement application having to be updated to know how to
>   reset everything the compositor might have been using, e.g. setting
>   just one LUT vs. two LUTs and a matrix in the hardware.
>
> - It allows a measurement app to cooperate with other apps without
>   being stomped on or having to shut down everything else while it runs.
>
> - It could allow uploading a new color profile to the compositor, if
>   various compositor projects can agree on how to do that. Given that
>   ICC spec exists, I guess there are good chances of succeeding. OTOH,
>   desktop projects do tend to dislike any interfaces that attempt to
>   bypass their own settings apps.
>
> - It does offer some amount of API abstraction
>
> However, the extension will be specific to Wayland so you still have a
> whole new platform to support in your color tool(kit)s.
>

Call me crazy but I'd like to think GTK or Qt could provide, if they
so chose, this abstraction such that a display profiling app could run
on any platform with any compositor, and calibrate or profile. These
higher level APIs should detect the compositor being used, and tell
the compositor to do the right thing for the task at hand.

Windows and macOS have one display pipeline and compositor. There are
many on Linux. Even multiple wayland compositors. And they're
potentially used in combination on top of each other. I'm thinking of
Qubes OS where each application runs in a VM. What about flatpaks and
snap applications? The idea that each net pipeline is different and
would need to be characterized separately doesn't sound workable.


-- 
Chris Murphy

Re: HDR support in Wayland/Weston

2019-03-08 Thread Pekka Paalanen
On Fri, 8 Mar 2019 08:35:20 +1100
Graeme Gill  wrote:

> Michel Dänzer wrote:
> 
> > It sounds like KMS leases could be a pretty good fit for a calibration
> > application. It can lease each output individually from the Wayland
> > compositor and fully control it using KMS APIs, while the Wayland
> > compositor continues running normally on other outputs.  
> 
> There seems to be this idea that has got a hold amongst many commentators
> on this topic here, that somehow the display calibration and profiling
> application NEEDs raw and direct access to a display to operate.

Hi Graeme,

that very idea stems from the early Wayland color management
discussions where it was strictly demanded that the application must be
able to completely bypass the compositor and own the hardware to do its
job correctly, because there is no trusting that a compositor could get
color management right. That is how it essentially was on X11 since the
X server did not second-guess or mangle application commands or images
and it did more or less expose the hardware of its time as is, letting
all the applications fight among each other for control.

I'm happy to see the original demand has been mostly dropped, but
apparently it was so strongly underlined that it is hard to understand
how much of it is actually necessary.

DRM leases are the modern idea for completely bypassing the compositor /
display server, and taking full control of the relevant part of the
display hardware yourself. DRM leases are driven by virtual reality
(VR) use cases for head-mounted displays (HMD), where the VR
application (or a VR compositor that is separate from the desktop
compositor due to hard realtime requirements) will be driving the HMD
directly by kernel UAPI (KMS) and it makes no sense to share the HMD
with anything else at the same time.

If we look at APIs, DRM KMS API is universal on Linux. DRM KMS is more
universal on Linux than Wayland, X11, or others, or any toolkit API,
because DRM KMS is *the* way to control displays at the lowest level
userspace can have access. KMS is hardware agnostic, but it does expose
hardware-specific features through a common API.

That said, personally I do think there is a good place for a Wayland
protocol extension designed for color
measurement/characterisation/calibration applications (is there a
shorter term I could use for all those apps?):

- It keeps the Wayland compositor in the loop, meaning that you are
  sure to reset the hardware state correctly for measurement, instead
  of the measurement application having to be updated to know how to
  reset everything the compositor might have been using, e.g. setting
  just one LUT vs. two LUTs and a matrix in the hardware.

- It allows a measurement app to cooperate with other apps without
  being stomped on or having to shut down everything else while it runs.

- It could allow uploading a new color profile to the compositor, if
  various compositor projects can agree on how to do that. Given that
  ICC spec exists, I guess there are good chances of succeeding. OTOH,
  desktop projects do tend to dislike any interfaces that attempt to
  bypass their own settings apps.

- It does offer some amount of API abstraction.
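On the "one LUT vs. two LUTs and a matrix" point above: newer display
hardware often applies a degamma LUT, then a 3x3 color transform matrix,
then a gamma LUT, so a measurement app that resets only a single gamma ramp
can leave two other stages active. A hypothetical sketch of that pixel
pipeline (illustration only, not Weston or KMS code):

```python
# Toy model of a degamma LUT -> 3x3 matrix -> gamma LUT display pipeline,
# illustrating why the compositor must reset every stage for measurement.

def identity_lut(size=256):
    """A pass-through 1D LUT over [0, 1]."""
    return [i / (size - 1) for i in range(size)]

def apply_lut(lut, v):
    """Sample a 1D LUT with linear interpolation; v is in [0, 1]."""
    x = v * (len(lut) - 1)
    lo = int(x)
    hi = min(lo + 1, len(lut) - 1)
    return lut[lo] + (lut[hi] - lut[lo]) * (x - lo)

def pipeline(rgb, degamma, ctm, gamma):
    """Run one RGB pixel through degamma LUT -> matrix -> gamma LUT."""
    linear = [apply_lut(degamma, c) for c in rgb]
    mixed = [sum(ctm[r][c] * linear[c] for c in range(3)) for r in range(3)]
    return [apply_lut(gamma, min(max(c, 0.0), 1.0)) for c in mixed]

# Only when ALL stages are identity is the hardware in a known state:
ident = identity_lut()
ident_ctm = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
out = pipeline([0.25, 0.5, 1.0], ident, ident_ctm, ident)
assert all(abs(a - b) < 1e-9 for a, b in zip(out, [0.25, 0.5, 1.0]))
```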

However, the extension will be specific to Wayland so you still have a
whole new platform to support in your color tool(kit)s.


Thanks,
pq



Re: HDR support in Wayland/Weston

2019-03-08 Thread Michel Dänzer
On 2019-03-08 9:41 a.m., Graeme Gill wrote:
> Michel Dänzer wrote:
> 
>> It was never reliable for that. Other clients using any of those
>> mechanisms could always interfere, at least for the RandR compatibility
>> output.
> 
> I disagree. It was reliable in the sense that running the
> profile loader set it to a known state, irrespective
> of whatever other applications may have done via
> other APIs.

You're assuming that the mechanism used by the profile loader directly
clobbers the HW LUT, making any adjustments made via other mechanisms
ineffective[0]. Even so, other clients can make adjustments using the
other mechanisms (or even the same one) at any time, which would again
clobber the HW LUT, interfering with the profile. Not reliable.

[0] This results in bug reports like
https://bugs.freedesktop.org/27222

> With the behavior changed to combine all the API settings, there is no
> simple way to set it to a known state.

You can set all other mechanisms to pass-through. If you want to be nice
to your users, maybe save their previous states and restore them
afterwards. Of course, if you want to prevent other clients from
interfering, you'd probably need to grab the server (which might result
in the desktop freezing with a compositing manager).
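Michel's suggestion can be pictured with a toy sketch (assumed behaviour
only, not the actual xserver composition code): if every mechanism other
than the profile loader's is set to an identity ramp, the composed result
is exactly the profile ramp, i.e. a known state. The `identity_ramp` and
`compose` helpers below are hypothetical:

```python
# Toy model of the server composing several 256-entry, 16-bit gamma ramps
# into one hardware LUT, as described for xf86HandleColormaps().

def identity_ramp(size=256, maxval=65535):
    """A pass-through gamma ramp (RandR/XVidMode-style 16-bit entries)."""
    return [round(i * maxval / (size - 1)) for i in range(size)]

def compose(first, second):
    """Feed the output of ramp `first` into ramp `second`
    (nearest-entry lookup, no interpolation)."""
    size = len(first)
    return [second[first[i] * (size - 1) // 65535] for i in range(size)]

# A gamma-2.2-ish calibration ramp standing in for the profile loader's:
profile = [min(65535, round(65535 * (i / 255) ** (1 / 2.2)))
           for i in range(256)]
ident = identity_ramp()

# With every other mechanism at pass-through, the composed hardware LUT
# is exactly the profile loader's ramp -- a known state:
assert compose(ident, profile) == profile
```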


-- 
Earthling Michel Dänzer   |  https://www.amd.com
Libre software enthusiast | Mesa and X developer

Re: HDR support in Wayland/Weston

2019-03-08 Thread Michel Dänzer
On 2019-03-07 10:07 p.m., Graeme Gill wrote:
> Michel Dänzer wrote:
> 
>> Yep. The alternative is that the different mechanisms clobber the
>> hardware LUT from each other, which sucks from a user POV.
> 
> Which user though ?
> 
> It certainly does the opposite of suck if you are a user
> who wants reliable color management, and so want a simple
> and direct mechanism to set the post frame buffer manipulation
> to a known state.
> 
> What does suck if you are a color critical user is that
> what used to be a reliable system (i.e. "run the profile
> loader") just became unreliable due to a system update.

It was never reliable for that. Other clients using any of those
mechanisms could always interfere, at least for the RandR compatibility
output.

> From this perspective I'm puzzled as to why such a change
> was implemented.

To make all these mechanisms work reliably and consistently at all times.


>> Welcome to
>> the wonderful world of "colour management" in X, please pick your
>> poison. I guess you can see why Wayland has a different design. :)
> 
> X11 has only got that way with such changes in behavior. It's
> been pretty reliable up to now, and at least is possible
> thanks to some foresight on the designers' and implementers' part.

If you have specific suggestions, please post them to the xorg-devel
mailing lists or create a merge request at
https://gitlab.freedesktop.org/xorg/xserver/merge_requests .


-- 
Earthling Michel Dänzer   |  https://www.amd.com
Libre software enthusiast | Mesa and X developer

Re: HDR support in Wayland/Weston

2019-03-07 Thread Chris Murphy
On Thu, Mar 7, 2019 at 2:35 PM Graeme Gill  wrote:
>
> Michel Dänzer wrote:
>
> > It sounds like KMS leases could be a pretty good fit for a calibration
> > application. It can lease each output individually from the Wayland
> > compositor and fully control it using KMS APIs, while the Wayland
> > compositor continues running normally on other outputs.
>
> There seems to be this idea that has got a hold amongst many commentators
> on this topic here, that somehow the display calibration and profiling
> application NEEDs raw and direct access to a display to operate.

Just to underscore how mistaken such an assumption would be: the
self-calibrating displays on the market produce a defined display state
(white point, black point, primaries, and tone reproduction curve), and
an ICC profile that describes that state, without any dependence
whatsoever on the OS or on the connection between display and computer.
That ICC profile correctly describes that display whether the OS is
macOS or Windows.

If everything is correctly done, that would also be true for Linux
whether GNOME, KDE, Xorg, or any Wayland compositor. The profile
doesn't describe how wrong a display is, it doesn't describe a
correction. It merely maps RGB values, submitted by an ordinary
application for display, to their color appearance (all kinds of
media, environment, observer assumptions are baked into that term).

The small problem we might have for the rest of the displays that
aren't self-calibrating, is lack of platform support parity for the
various video card LUTs. But for that (important) detail, it should
otherwise be true that an ICC profile describes the display's state
regardless of the display rendering pipeline (ergo, regardless of
platform).



-- 
Chris Murphy

Re: HDR support in Wayland/Weston

2019-03-07 Thread Graeme Gill
Michel Dänzer wrote:

> It sounds like KMS leases could be a pretty good fit for a calibration
> application. It can lease each output individually from the Wayland
> compositor and fully control it using KMS APIs, while the Wayland
> compositor continues running normally on other outputs.

There seems to be this idea that has got a hold amongst many commentators
on this topic here, that somehow the display calibration and profiling
application NEEDs raw and direct access to a display to operate.

If you have actually sat down and written such an application yourself,
(which I have, and have been maintaining on 3 separate operating systems
for the last 10 years +), then I'm willing to hear your reasons for
thinking that.

If you haven't actually written such an application (or at least
seriously sat down to understand color management and thought through
the implications it has on the functional requirements of such applications),
then let me state again that YOU ARE COMPLETELY MISTAKEN. The reality
is the opposite of what you are thinking. A calibration and profiling
application needs access to the display IN ITS NORMAL OPERATING STATE.
That is the only way it can be certain that it is characterizing its
operating color behavior, and that the calibration machinery is behaving
the same way it does when working with actual applications.

It's only a "special" application in needing to be able to position
a window on a specific screen at a specific location in such a way that
the window is not obscured, and has access to the color management configuration
(setting calibration curves & installing color profiles).
In every other way it is an ordinary application that can regularly run
alongside all the other ordinary applications, and needs access
to all the ordinary application facilities for providing a UI that
the user can use (and access to connected hardware, typically via
USB or Bluetooth).

Thanks,
Graeme Gill.

Re: HDR support in Wayland/Weston

2019-03-07 Thread Graeme Gill
Michel Dänzer wrote:

> As of xserver 1.19, if the Xorg driver calls xf86HandleColormaps(), all
> relevant mappings (colormap, global gamma, xf86VidMode per-X-screen
> ramp, RandR per-CRTC ramp) are composed, and the result is applied to
> the hardware LUT for all CRTCs.

It's disappointing a change with such serious implications for
X11 color management was made without any consultation or even
notification to those that would be affected.

What order are all these things composed ?

Thanks,
Graeme Gill.

Re: HDR support in Wayland/Weston

2019-03-07 Thread Graeme Gill
Chris Murphy wrote:

Hi Chris,

> Not every desktop environment is using the same Wayland compositor, or
> even a Wayland compositor at all. So is drm/kms something you can
> depend on most of the time regardless of the desktop?

I'm sure you could get even wider coverage of color management
tools by writing software that talks directly to the hardware,
duplicating all the graphics and mouse drivers etc. that MSWindows/OS X/Linux
implement - but what team of people is going to write and support that?
How convenient is it to use if you have to re-boot to run it?
What use are the display profiles if you have no confidence that
they are valid for the graphics sub-system you actually want to use
them with? And what use are such profiles if you have no reliable way
of installing them on the system you want to use them on?

If drm/kms is such a great application target, why is Wayland being
developed at all - why isn't every Linux application written to
talk to drm/kms ?

> Android is off on its own I suspect. What's Chrome OS using? I dunno,
> maybe that's not where you want to be? :-D

Irrelevant - they have their own designs as operating systems,
and may or may not have color management APIs. Android had
no reasonable prospect of CM up to now (just like Wayland!), but
recently started to add support in its own fashion. If Chrome
follows suit, they will soon be far ahead of Wayland in this
regard.

Graeme.

Re: HDR support in Wayland/Weston

2019-03-07 Thread Chris Murphy
On Thu, Mar 7, 2019 at 3:15 AM Michel Dänzer  wrote:
>
> On 2019-03-07 8:05 a.m., Chris Murphy wrote:

> > Of course. It can take 5-30 minutes to do a calibration and
> > characterization. In particular if I have 2, 3 or even 4 displays
> > connected I'd want to calibrate them in sequence while the others are
> > being used for useful tasks.
>
> It sounds like KMS leases could be a pretty good fit for a calibration
> application. It can lease each output individually from the Wayland
> compositor and fully control it using KMS APIs, while the Wayland
> compositor continues running normally on other outputs.

If developers of such applications need to do substantially different
things on each platform, it's a big negative. The only point for using
drm/kms is if it's easier for the developer, whether they use it
directly or through some kind of abstraction that can be expected to
work in the same code base and easily do the right thing whether the
application depends on Xorg or an arbitrary Wayland compositor.
Otherwise, it's effectively a proliferation of platforms.

I don't have numbers, but my handwavy guess is full screen versus
floating window display calibration applications is about 50/50. So
it's a substantial UI/UX change to ask some developers to expect to
have to rebuild for full screen only, rather than some kind of cut out
in a window.

Also, calibration/profiling tools do verification and are often used
for troubleshooting and diagnosis. So they are extra special in that
they need a calibration mode, as well as a normal mode to verify both
calibration and characterization. And if switching between them is
akin to parking the car, getting out, getting into a semi-truck,
driving 100 meters, parking the truck, getting out, walking back to
and getting into the car to resume - that's not just a PITA for them,
it makes them less simple. And the simpler they can be, the easier
they are to maintain, and more trustworthy they are as both diagnostic
and calibration/characterization tools.


-- 
Chris Murphy

Re: HDR support in Wayland/Weston

2019-03-07 Thread Michel Dänzer
On 2019-03-07 1:43 p.m., Kai-Uwe wrote:
> Am 07.03.19 um 11:15 schrieb Michel Dänzer:
>> On 2019-03-07 8:05 a.m., Chris Murphy wrote:
>>> On Wed, Mar 6, 2019 at 10:02 PM Graeme Gill  wrote:
>>>> [ And why should Linux/Wayland be crippled compared to
>>>>   every other system ? I can and do do things like fire up
>>>>   a test patch display using ArgyllCMS/dispwin in one corner of my
>>>>   screen while running ArgyllCMS/spotread in another window to
>>>>   measure the patches. There's no reason not to, and every reason
>>>>   to be able to. ]
>>> Of course. It can take 5-30 minutes to do a calibration and
>>> characterization. In particular if I have 2, 3 or even 4 displays
>>> connected I'd want to calibrate them in sequence while the others are
>>> being used for useful tasks.
>> It sounds like KMS leases could be a pretty good fit for a calibration
>> application. It can lease each output individually from the Wayland
>> compositor and fully control it using KMS APIs, while the Wayland
>> compositor continues running normally on other outputs.
> 
> I am afraid this concept reads as quite exotic to me as a non-kernel
> developer. Would you mind elaborating on that feature with some
> example code or a reference on how to write such an application, and
> how it integrates into a DE?

I'll leave that to others who know more about KMS leases.
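For readers in Kai-Uwe's position: the kernel side of a KMS lease is a
single ioctl on the DRM master fd (DRM_IOCTL_MODE_CREATE_LEASE), which
returns a new file descriptor restricted to the listed objects (typically
one CRTC and one connector). A rough sketch of the UAPI plumbing follows;
the struct layout mirrors the kernel's drm_mode_create_lease, the object
IDs are placeholders, and real applications would normally call libdrm's
drmModeCreateLease() instead of hand-rolling this:

```python
import ctypes
import fcntl
import os
import struct

# struct drm_mode_create_lease (kernel UAPI, include/uapi/drm/drm_mode.h):
#   __u64 object_ids;   /* pointer to array of object IDs to lease */
#   __u32 object_count;
#   __u32 flags;        /* e.g. O_CLOEXEC for the returned fd */
#   __u32 lessee_id;    /* out */
#   __u32 fd;           /* out: the lease fd */

def pack_create_lease(object_ids_ptr, count, flags=os.O_CLOEXEC):
    """Build the 24-byte ioctl argument for DRM_IOCTL_MODE_CREATE_LEASE."""
    return struct.pack("=QIIII", object_ids_ptr, count, flags, 0, 0)

# _IOWR('d', 0xC6, struct of 24 bytes): read|write dir, size, type, nr.
DRM_IOCTL_MODE_CREATE_LEASE = (3 << 30) | (24 << 16) | (ord('d') << 8) | 0xC6

def create_lease(master_fd, object_ids):
    """Lease object_ids (e.g. [crtc_id, connector_id]) from a DRM master
    fd; returns (lessee_id, lease_fd). The caller must be DRM master --
    in practice the lease is handed out by the compositor."""
    arr = (ctypes.c_uint32 * len(object_ids))(*object_ids)
    buf = bytearray(pack_create_lease(ctypes.addressof(arr),
                                      len(object_ids)))
    fcntl.ioctl(master_fd, DRM_IOCTL_MODE_CREATE_LEASE, buf)
    _, _, _, lessee_id, lease_fd = struct.unpack("=QIIII", bytes(buf))
    return lessee_id, lease_fd

assert len(pack_create_lease(0, 0)) == 24
```

The calibration app would then drive the leased CRTC/connector with
ordinary KMS calls on the lease fd while the compositor keeps running on
the other outputs; revoking the lease returns control to the compositor.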


-- 
Earthling Michel Dänzer   |  https://www.amd.com
Libre software enthusiast | Mesa and X developer

Re: HDR support in Wayland/Weston

2019-03-07 Thread Kai-Uwe
Am 07.03.19 um 11:15 schrieb Michel Dänzer:
> On 2019-03-07 8:05 a.m., Chris Murphy wrote:
>> On Wed, Mar 6, 2019 at 10:02 PM Graeme Gill  wrote:
>>> [ And why should Linux/Wayland be crippled compared to
>>>   every other system ? I can and do do things like fire up
>>>   a test patch display using ArgyllCMS/dispwin in one corner of my
>>>   screen while running ArgyllCMS/spotread in another window to
>>>   measure the patches. There's no reason not to, and every reason
>>>   to be able to. ]
>> Of course. It can take 5-30 minutes to do a calibration and
>> characterization. In particular if I have 2, 3 or even 4 displays
>> connected I'd want to calibrate them in sequence while the others are
>> being used for useful tasks.
> It sounds like KMS leases could be a pretty good fit for a calibration
> application. It can lease each output individually from the Wayland
> compositor and fully control it using KMS APIs, while the Wayland
> compositor continues running normally on other outputs.

I am afraid this concept reads as quite exotic to me as a non-kernel
developer. Would you mind elaborating on that feature with some
example code or a reference on how to write such an application, and
how it integrates into a DE?

thanks,
Kai-Uwe Behrmann

Re: HDR support in Wayland/Weston

2019-03-07 Thread Florian Höch
Am 07.03.2019 um 05:26 schrieb Chris Murphy:
> Hmmm. For a while now we've had display calibration+profiling
> applications compel full screen mode

While some calibration/profiling applications are able to display a
fullscreen window for the patch area (some may even default to it), I
know none that implement their own GUI toolkit to do so (which, as I
understand it, is what it would come down to if it were using something
low-level like KMS/DRM). They instead use an existing GUI toolkit of
choice to achieve this (e.g. DisplayCAL uses GTK under Linux, X-Rite i1
Profiler and basICColor use Qt, even though the latter two are not
available on Linux).
There is a big difference between having a "fullscreened" window with
normal GUI elements on it and basically having to roll your own GUI
framework just to achieve the same. The latter IS a non-starter if there
ever was one.
Also, as an aside, fullscreen window (for the patch area) is not
necessarily desirable for various reasons. Incidentally, many
calibration/profiling apps don't use it by default (e.g. basICColor,
DisplayCAL, etc).

Florian.

Re: HDR support in Wayland/Weston

2019-03-07 Thread Michel Dänzer
On 2019-03-07 8:05 a.m., Chris Murphy wrote:
> On Wed, Mar 6, 2019 at 10:02 PM Graeme Gill  wrote:
>>
>> Chris Murphy wrote:
>>
>>> Hmmm. For a while now we've had display calibration+profiling
>>> applications compel full screen mode so they're not really usable
>>> alongside anything else. They are in effect taking over. So if it's
>>> possible for the calibration app to set aside the Wayland session, use
>>> drm/kms full screen, and then restore the Wayland session I might be
>>> OK with it. But if I have to log out, not OK.
>>
>> Sorry, as a Color Management application writer, I'm
>> not OK with it. I'd be better off firing up my own complete
>> copy of Wayland with the CM API's in it and use it
>> to talk to drm/kms, rather than trying to write an app to
>> talk direct to drm/kms. And of course if that's the easiest
>> course, why do that - just incorporate the CM API's in
>> stock Wayland and be done with it!
> 
> Not every desktop environment is using the same Wayland compositor, or
> even a Wayland compositor at all. So is drm/kms something you can
> depend on most of the time regardless of the desktop?
> 
> [...]
> 
> 
>> [ And why should Linux/Wayland be crippled compared to
>>   every other system ? I can and do do things like fire up
>>   a test patch display using ArgyllCMS/dispwin in one corner of my
>>   screen while running ArgyllCMS/spotread in another window to
>>   measure the patches. There's no reason not to, and every reason
>>   to be able to. ]
> 
> Of course. It can take 5-30 minutes to do a calibration and
> characterization. In particular if I have 2, 3 or even 4 displays
> connected I'd want to calibrate them in sequence while the others are
> being used for useful tasks.

It sounds like KMS leases could be a pretty good fit for a calibration
application. It can lease each output individually from the Wayland
compositor and fully control it using KMS APIs, while the Wayland
compositor continues running normally on other outputs.


-- 
Earthling Michel Dänzer   |  https://www.amd.com
Libre software enthusiast | Mesa and X developer

Re: HDR support in Wayland/Weston

2019-03-07 Thread Michel Dänzer
On 2019-03-07 5:38 a.m., Graeme Gill wrote:
> Michel Dänzer wrote:
>> As of xserver 1.19, if the Xorg driver calls xf86HandleColormaps(), all
>> relevant mappings (colormap, global gamma, xf86VidMode per-X-screen
>> ramp, RandR per-CRTC ramp) are composed, and the result is applied to
>> the hardware LUT for all CRTCs.
> 
> Hmm. Yuk from a color management point of view. So when
> I load up an ICC display profile I really need to reset
> all of that to be certain the screen is color calibrated!

Yep. The alternative is that the different mechanisms clobber the
hardware LUT from each other, which sucks from a user POV. Welcome to
the wonderful world of "colour management" in X, please pick your
poison. I guess you can see why Wayland has a different design. :)


-- 
Earthling Michel Dänzer   |  https://www.amd.com
Libre software enthusiast | Mesa and X developer

Re: HDR support in Wayland/Weston

2019-03-06 Thread Chris Murphy
On Wed, Mar 6, 2019 at 10:02 PM Graeme Gill  wrote:
>
> Chris Murphy wrote:
>
> > Hmmm. For a while now we've had display calibration+profiling
> > applications compel full screen mode so they're not really usable
> > alongside anything else. They are in effect taking over. So if it's
> > possible for the calibration app to set aside the Wayland session, use
> > drm/kms full screen, and then restore the Wayland session I might be
> > OK with it. But if I have to log out, not OK.
>
> Sorry, as a Color Management application writer, I'm
> not OK with it. I'd be better off firing up my own complete
> copy of Wayland with the CM API's in it and use it
> to talk to drm/kms, rather than trying to write an app to
> talk direct to drm/kms. And of course if that's the easiest
> course, why do that - just incorporate the CM API's in
> stock Wayland and be done with it!

Not every desktop environment is using the same Wayland compositor, or
even a Wayland compositor at all. So is drm/kms something you can
depend on most of the time regardless of the desktop?

Android is off on its own I suspect. What's Chrome OS using? I dunno,
maybe that's not where you want to be? :-D


> [ And why should Linux/Wayland be crippled compared to
>   every other system ? I can and do do things like fire up
>   a test patch display using ArgyllCMS/dispwin in one corner of my
>   screen while running ArgyllCMS/spotread in another window to
>   measure the patches. There's no reason not to, and every reason
>   to be able to. ]

Of course. It can take 5-30 minutes to do a calibration and
characterization. In particular if I have 2, 3 or even 4 displays
connected I'd want to calibrate them in sequence while the others are
being used for useful tasks.

-- 
Chris Murphy

Re: HDR support in Wayland/Weston

2019-03-06 Thread Graeme Gill
Chris Murphy wrote:

> Hmmm. For a while now we've had display calibration+profiling
> applications compel full screen mode so they're not really usable
> alongside anything else. They are in effect taking over. So if it's
> possible for the calibration app to set aside the Wayland session, use
> drm/kms full screen, and then restore the Wayland session I might be
> OK with it. But if I have to log out, not OK.

Sorry, as a Color Management application writer, I'm
not OK with it. I'd be better off firing up my own complete
copy of Wayland with the CM API's in it and use it
to talk to drm/kms, rather than trying to write an app to
talk direct to drm/kms. And of course if that's the easiest
course, why do that - just incorporate the CM API's in
stock Wayland and be done with it!

[ And why should Linux/Wayland be crippled compared to
  every other system ? I can and do do things like fire up
  a test patch display using ArgyllCMS/dispwin in one corner of my
  screen while running ArgyllCMS/spotread in another window to
  measure the patches. There's no reason not to, and every reason
  to be able to. ]

Cheers,
Graeme.

Re: HDR support in Wayland/Weston

2019-03-06 Thread Graeme Gill
Michel Dänzer wrote:
> As of xserver 1.19, if the Xorg driver calls xf86HandleColormaps(), all
> relevant mappings (colormap, global gamma, xf86VidMode per-X-screen
> ramp, RandR per-CRTC ramp) are composed, and the result is applied to
> the hardware LUT for all CRTCs.

Hmm. Yuk from a color management point of view. So when
I load up an ICC display profile I really need to reset
all of that to be certain the screen is color calibrated!

Cheers,
Graeme Gill.

Re: HDR support in Wayland/Weston

2019-03-06 Thread Graeme Gill
Adam Jackson wrote:
> Sure, but one would not expect to control the display's global
> calibration state from an X client in this model, for broadly the same
> reasons that RANDR under Xwayland is read-only. The wayland server owns
> that state, the Xwayland server is simply a very demanding wayland
> client.

Right, but there needs to be facility for privileged Wayland clients
that can configure the wayland server as an agent of the user.

Cheers,
Graeme Gill.

Re: HDR support in Wayland/Weston

2019-03-06 Thread Chris Murphy
On Wed, Mar 6, 2019 at 7:12 PM Graeme Gill  wrote:
>
> Carsten Haitzler wrote:
> > for the purposes of calibration, imho a calibration tool should just use
> > drm/kms directly and run in a console outside of wayland.
>
> Sorry, but that's a total non-starter. Calibration & profiling
> tools are applications, and need to run in a normal application
> environment to interact with the user. It's like saying that
> the user should switch to console only mode to mount a drive
> or change a file permission. No-one expects GUI based systems to
> operate that way.

Hmmm. For a while now we've had display calibration+profiling
applications compel full screen mode so they're not really usable
alongside anything else. They are in effect taking over. So if it's
possible for the calibration app to set aside the Wayland session, use
drm/kms full screen, and then restore the Wayland session I might be
OK with it. But if I have to log out, not OK.


--
Chris Murphy

Re: HDR support in Wayland/Weston

2019-03-06 Thread Graeme Gill
Carsten Haitzler wrote:
> On Wed, 6 Mar 2019 16:37:55 +1100 Graeme Gill  said:

> it involves a screen or set of screens "flashing" between different
> colorspaces. it's much the same kind of effect of ye olde colormap installs.
> not as extreme, but still the entire screen content changing appearance as one
> client is taking control.

That's not a problem if that is what the user is choosing to do.
i.e. they are choosing to have a calibrated screen, or choosing
to use a "blue light filter". Their explicit intention is to
change the appearance of their whole display.

>> A game doing this - yes, it's setting it up just for its
>> private use.
> 
> a game should not either. it's a window (surface) within a larger ecosystem
> and doesn't own the display. the compositor's job is to ensure that everyone
> plays nice together and to ensure the user has as seamless an experience as
> possible.

Sure - there shouldn't be side effects from something that is for the
use of a single application's windows.

> the client can provide colorspace info/lut's alongside a surface/buffer and
> then leave it up to the compositor to implement it or not (alongside perhaps
> the ability to query for such extended features if they exist in the
> compositor at all - this is a question though of what is the baseline
> featureset we expect from compositors)

Right.

> but it shouldn't CONTROL the output (via any explicit or implicit protocol and
> specs). if the compositor chooses to remap everything on screen so that the
> appearance remains constant based on the focused surface (surface A with
> colorspace X and other surfaces with colorspace Y it can transform to a single
> colrospace based on what the screen is configured to display at the time to
> provide visually a constant experience if possible).

Of course a configuration application should control the compositor. That's
what its purpose is - to serve the user in exercising their control over their
compositor's behavior.

> for the purposes of calibration, imho a calibration tool should just use
> drm/kms directly and run in a console outside of wayland.

Sorry, but that's a total non-starter. Calibration & profiling
tools are applications, and need to run in a normal application
environment to interact with the user. It's like saying that
the user should switch to console only mode to mount a drive
or change a file permission. No-one expects GUI based systems to
operate that way.

> it then owns the
> display. it's not like it's a commonly used tool (likely once on purchase of a
> gpu and/or a monitor).

For those to whom it's vital, it's a commonly used tool. It might
be used once a month, once a week, or needed right now, before some
color critical work is performed. It may be needed to profile
a printer, profile a scanner, or do a soft proof preview.
And switching to some pixel processing pipeline that is
not exactly the same as the compositor is exactly the _opposite_
of what is required for assurance that the profile is valid.
A profile should be made with as identical a workflow to
normal as possible.

> it shouldn't even be needed for pre-calibrated monitors.

That's simply not true. Few except very high end monitors come
from the factory with that level of color reproducibility.
And even then, anyone doing color critical things can't take
the display manufacturers word - the only way to have confidence
that a display is faithfully emulating a particular colorspace
is to profile it. (And it won't be faithfully emulating anyway
- the black levels will be different to an abstract standard
colorspace.)

> it can calibrate alongside appropriate colorimeter tools and work out some
> profile screen by screen to be given to the compositor in a standard well
> documented format. the compositor then can use that profile to know the true
> color output of that screen and can appropriately adjust content.

Yes, that would be an ICC device profile.

> blue light filter is a "compositor problem". it may or may not farm it off to
> a client but it's not something that should be allowed by clients in general -
> these are at best speciality clients that work closely with a compositor, or
> more likely something compositor specific via whatever extension mechanisms
> that compositor supports.

Yes. "Specialty clients" are clients. "Configuration clients" are clients.
Clients work via API's. API's for configuration of the Compositor are
needed to allow the user to configure it the way they want and need.

Cheers,
Graeme Gill.

Re: HDR support in Wayland/Weston

2019-03-06 Thread Chris Murphy
On Wed, Mar 6, 2019 at 3:15 AM Carsten Haitzler  wrote:
>
> for the purposes of calibration, imho a calibration tool should just use
> drm/kms directly and run in a console outside of wayland. it then owns the
> display. it's not like it's a commonly used tool (likely once on purchase of a
> gpu and/or a monitor).

Weekly or monthly is common once you own such hardware. Display
behavior changes quite a bit, and can be highly variable depending on
the component sourcing for the light source.

About the only device profile good for the life of the device, is a
camera profile.


-- 
Chris Murphy

Re: HDR support in Wayland/Weston

2019-03-06 Thread Michel Dänzer
On 2019-03-06 5:00 p.m., Adam Jackson wrote:
> On Wed, 2019-03-06 at 15:52 +1100, Graeme Gill wrote:
>> Adam Jackson wrote:
>>
>>> The second, which games typically use, is setting per-channel gamma
>>> (implicitly for the whole screen) as single floating-point values with
>>> the xf86vidmode extension.
>>
>> Typically this is too crude a control for color management use.
>> My assumption (which could be wrong) is that this is overridden
>> by the per-crtc LUT.
> 
> Technically I think what happens is the vidmode gamma is applied to
> RANDR's "compat" output, which is chosen by some handwavey heuristic.

Before xserver 1.19, the xf86VidMode per-X-screen ramp (with the global
gamma value controlled by xgamma incorporated) and the RandR per-CRTC
ramp were each directly applied to the hardware LUT for the RandR
compatibility output without coordination, so whichever was set last
stuck. (Colormaps were ineffective with RandR 1.2 capable drivers since
xserver 1.7.)

As of xserver 1.19, if the Xorg driver calls xf86HandleColormaps(), all
relevant mappings (colormap, global gamma, xf86VidMode per-X-screen
ramp, RandR per-CRTC ramp) are composed, and the result is applied to
the hardware LUT for all CRTCs.

(A bug snuck into 1.19 which resulted in the composed mapping being
incorrect for the RandR compatibility output between setting the global
gamma value and setting the RandR per-CRTC ramp. This is fixed in 1.20.4.)
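The composition described above can be sketched in a few lines. This is an illustrative model, not the xserver's actual code: 16-bit ramps (as X stores them) are fed into one another by interpolation, with the composed result standing in for the single hardware LUT; the order and rounding here are assumptions of the sketch.

```python
import numpy as np

XMAX = 65535  # X gamma ramps are 16-bit values

def gamma_ramp(gamma, size=256):
    """A per-channel ramp for a single gamma value, 0..65535."""
    x = np.linspace(0.0, 1.0, size)
    return np.rint((x ** (1.0 / gamma)) * XMAX).astype(np.uint16)

def compose(ramps):
    """Feed each ramp's output into the next, interpolating between stops.

    Mirrors the idea of composing colormap, global gamma, xf86VidMode
    and RandR ramps into the one hardware LUT (illustrative only).
    """
    size = len(ramps[0])
    out = np.linspace(0.0, XMAX, size)
    for ramp in ramps:
        stops = np.linspace(0.0, XMAX, len(ramp))
        out = np.interp(out, stops, ramp)
    return np.rint(out).astype(np.uint16)

# Composing an identity ramp with a gamma 2.2 per-CRTC ramp leaves
# just the gamma curve in the hardware LUT.
hw_lut = compose([gamma_ramp(1.0), gamma_ramp(2.2)])
```

With all sources at identity except one, the composed LUT reduces to that one mapping, which is why loading a profile's calibration still works as long as every other mechanism is reset.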


-- 
Earthling Michel Dänzer   |  https://www.amd.com
Libre software enthusiast | Mesa and X developer

Re: HDR support in Wayland/Weston

2019-03-06 Thread Adam Jackson
On Wed, 2019-03-06 at 15:52 +1100, Graeme Gill wrote:
> Adam Jackson wrote:
> 
> > X kinda has three mechanisms for this. The first one, that nobody
> > really uses, is setting the colormap for a DirectColor visual.
> 
> Actually this is something I check and set to linear before
> calibration & profiling in the ArgyllCMS tools.

Yeah, I was corrected about this on IRC as well. Apparently DirectColor
really is a thing people use.

> > The second, which games typically use, is setting per-channel gamma
> > (implicitly for the whole screen) as single floating-point values with
> > the xf86vidmode extension.
> 
> Typically this is too crude a control for color management use.
> My assumption (which could be wrong) is that this is overridden
> by the per-crtc LUT.

Technically I think what happens is the vidmode gamma is applied to
RANDR's "compat" output, which is chosen by some handwavey heuristic.

> > All of these are effectively the program specifying its transformation
> > to what it hopes is linear in device space. The sample server happens
> > to implement all three as global state, but that's an implementation
> > detail. It would be straightforward to give each Xwayland client the
> > illusion of complete control if we wanted.
> 
> For the purposes of setting the display global color calibration state,
> then this is not desirable.

Sure, but one would not expect to control the display's global
calibration state from an X client in this model, for broadly the same
reasons that RANDR under Xwayland is read-only. The wayland server owns
that state, the Xwayland server is simply a very demanding wayland
client.

- ajax


Re: HDR support in Wayland/Weston

2019-03-06 Thread Carsten Haitzler
On Wed, 6 Mar 2019 16:37:55 +1100 Graeme Gill  said:

> Carsten Haitzler (The Rasterman) wrote:
> > apps should not have exclusive access. we're re-doing the whole horrid
> > "install colormap" thing from the x days of 256 color (or
> > paletted/colormapped displays).
> 
> It's not quite the same thing in all cases.

it involves a screen or set of screens "flashing" between different
colorspaces. it's much the same kind of effect as ye olde colormap installs.
not as extreme, but still the entire screen content changing appearance as one
client is taking control.

> A game doing this - yes, it's setting it up just for its
> private use.

a game should not either. it's a window (surface) within a larger ecosystem and
doesn't own the display. the compositor's job is to ensure that everyone plays
nice together and to ensure the user has as seamless an experience as possible.

the client can provide colorspace info/lut's alongside a surface/buffer and
then leave it up to the compositor to implement it or not (alongside perhaps
the ability to query for such extended features if they exist in the
compositor at all - this is a question though of what is the baseline
featureset we expect from compositors)

but it shouldn't CONTROL the output (via any explicit or implicit protocol and
specs). if the compositor chooses to remap everything on screen so that the
appearance remains constant based on the focused surface (surface A with
colorspace X and other surfaces with colorspace Y it can transform to a single
colorspace based on what the screen is configured to display at the time to
provide visually a constant experience if possible).

for the purposes of calibration, imho a calibration tool should just use
drm/kms directly and run in a console outside of wayland. it then owns the
display. it's not like it's a commonly used tool (likely once on purchase of a
gpu and/or a monitor). it shouldn't even be needed for pre-calibrated monitors.
it can calibrate alongside appropriate colorimeter tools and work out some
profile screen by screen to be given to the compositor in a standard well
documented format. the compositor then can use that profile to know the true
color output of that screen and can appropriately adjust content.

> Color calibration or "blue light filter" not really - they
> are using it as a mechanism for deliberately altering
> the color of the whole display so that it affects the appearance
> of all other applications.

blue light filter is a "compositor problem". it may or may not farm it off to a
client but it's not something that should be allowed by clients in general -
these are at best speciality clients that work closely with a compositor, or
more likely something compositor specific via whatever extension mechanisms
that compositor supports.

implementing any such protocol for clients is just going back in time to
clients messing with key repeat and users wondering why their key repeat is now
broken as the client doesn't do it right or colormap installs forcibly being
done by clients, or clients messing with screensaver params to try to force it
to turn off then users complaining that screen blanking is broken and it ends up
being some random client they didn't know about, or the screen resolution
changing and multi-monitor config being nuked because some dumb client decided
to use xf86vidmode to change screen res not knowing that people might have
multiple screens configured via xinerama or xrandr and then not even bothering
to restore things afterwards either. i've seen many of the outcomes and
problems of giving clients direct control over the screen over my decades in
x11, and it's a bad thing. we shouldn't repeat the same mistakes. once you open
up these doors, they are impossible to close because you now need them for
compatibility.

> Whether the latter two are in conflict is an interesting question.
> 
> For the purposes of getting a known color behavior they are
> in conflict. But then they could also co-operate :- the
> "blue light filter" could make use of color management
> to implement a specific transform, and do it in such a way
> that the white point relative color behavior remains unchanged.
> 
> Cheers,
>   Graeme Gill.
> 

-- 
- Codito, ergo sum - "I code, therefore I am" --
Carsten Haitzler - ras...@rasterman.com


Re: HDR support in Wayland/Weston

2019-03-05 Thread Graeme Gill
Pekka Paalanen wrote:
> I presume the measurement or calibration use case always involves
> "owning" the whole monitor, and the very specific monitor at that. That
> is, making the monitor temporarily exclusive to the app, so that
> nothing else can interfere (e.g. instant messaging notification popping
> up just under the measurement sensor). That would also give an
> opportunity, if wanted(!), to bypass compositor color conversions and
> do or not do anything else special.

It really doesn't have to have exclusive access. Being able
to display a window at a specific location & size and in a
way that can't be overlayed by any other window or hidden by
a screensaver is sufficient.

> IOW, measurement/calibration/characterization is off the scope of the
> currently on-going discussions.

A separate protocol yes. But it needs to be co-designed and
developed so that the two work together and can be tested
together. You can't sign off on the "using" protocol without
knowing that it actually works with the "setting" protocol,
and it makes life hard to try and test the first without the
existence of the second.

Cheers,
Graeme Gill.


Re: HDR support in Wayland/Weston

2019-03-05 Thread Graeme Gill
Carsten Haitzler (The Rasterman) wrote:
> apps should not have exclusive access. we're re-doing the whole horrid
> "install colormap" thing from the x days of 256 color (or
> paletted/colormapped displays).

It's not quite the same thing in all cases.

A game doing this - yes, it's setting it up just for its
private use.

Color calibration or "blue light filter" not really - they
are using it as a mechanism for deliberately altering
the color of the whole display so that it affects the appearance
of all other applications.

Whether the latter two are in conflict is an interesting question.

For the purposes of getting a known color behavior they are
in conflict. But then they could also co-operate :- the
"blue light filter" could make use of color management
to implement a specific transform, and do it in such a way
that the white point relative color behavior remains unchanged.

Cheers,
Graeme Gill.


Re: HDR support in Wayland/Weston

2019-03-05 Thread Graeme Gill
Simon Ser wrote:
> On Monday, March 4, 2019 8:13 AM, Graeme Gill  wrote:

>> 2) Implement virtual per channel LUTs, with the compositor combining them
>>    together in some way, and have some means of the color management
>>    applications being aware when the display is being interfered with by
>>    another application, so that the user can be warned that the color
>>    management state is invalid.
> 
> Is there a "good way" to combine multiple LUTs?

It might be possible to compute approximately color correct
device value curves that can be combined together, based
on the ICC profile characterization. The advantage
of this is that it could be applied to a display
without any application having to be aware of it
(although a color critical application may want to
be able to warn the user.)

>> 1) A color managed API that lets an application shift the display white point
>>    using chromatic adaptation, so that such blue light filter applications
>>    can operate more predictably, as well as some means of the color
>>    management applications being aware of when this is happening.
> 
> How should this look like? Disclaimer: I have no idea how these applications
> work and I know nothing about color management.

The logical way of supporting this in ICC profile terms would be
to allow for the insertion of an ICC Abstract Profile in
the color conversions (This is a PCS -> PCS transform. So
you would do a Source Dev->PCS, Abstract PCS->PCS, Destination PCS->Dev).
A chromatically correct white point shift would be pretty
simple to specify as an abstract profile.
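For illustration, the "chromatically correct white point shift" such an abstract profile would encode is essentially a von Kries adaptation in a cone space. A minimal sketch using the standard Bradford matrix (the D65 and D50 white point values here are the usual published XYZ coordinates, used as example inputs):

```python
import numpy as np

# Bradford cone response matrix, as used for ICC chromatic adaptation.
BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])

def adaptation_matrix(src_white_xyz, dst_white_xyz):
    """XYZ -> XYZ matrix mapping the source white to the destination white.

    This is the kind of PCS -> PCS transform an abstract profile could
    carry to shift the display white point while keeping colours
    consistent relative to the new white.
    """
    src = BRADFORD @ np.asarray(src_white_xyz, float)
    dst = BRADFORD @ np.asarray(dst_white_xyz, float)
    scale = np.diag(dst / src)  # von Kries scaling in cone space
    return np.linalg.inv(BRADFORD) @ scale @ BRADFORD

D65 = (0.9505, 1.0000, 1.0890)
D50 = (0.9642, 1.0000, 0.8249)

M = adaptation_matrix(D65, D50)
warm_white = M @ np.array(D65)  # the D65 white lands on the D50 white
```

Inserted between the Source Dev->PCS and Destination PCS->Dev steps, this gives a "blue light filter" style warming that remains well defined in colorimetric terms.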

Conversions done by the Compositor could incorporate the abstract
profile in the linking. Conversions done by color aware applications
could not be forced to honor the abstract profile, but they could choose
to honor it, and they would explicitly be aware of the color
modification although not the reason/intent of it without
some extra meta information.

> I'm guessing this is a restriction of the "change the whole LUTs" API. Are
> there any features the "blue light filter" app won't be able to implement
> when switching to this API?

Good question. I'm not aware of the range of applications that do this
kind of thing. I guess a search of open source apps that use the
relevant API's might give a clue.

> Would the compositor part become complicated (judging
> from [2] it seems different "blue light filter" apps may compute LUTs
> differently)?

A little, it shouldn't add much since lcms supports Abstract profiles.

Cheers,
Graeme Gill.

Re: HDR support in Wayland/Weston

2019-03-05 Thread Graeme Gill
Adam Jackson wrote:

Hi,

> X kinda has three mechanisms for this. The first one, that nobody
> really uses, is setting the colormap for a DirectColor visual.

Actually this is something I check and set to linear before
calibration & profiling in the ArgyllCMS tools.

> The
> second, which games typically use, is setting per-channel gamma
> (implicitly for the whole screen) as single floating-point values with
> the xf86vidmode extension.

Typically this is too crude a control for color management use.
My assumption (which could be wrong) is that this is overridden
by the per-crtc LUT.

> The third, which desktop environments
> sometimes use to try to make distinct displays look similar, is setting
> per-crtc gamma as 256 (or whatever) stops per channel with the RANDR
> extension.

This is the avenue for implementing the Apple/ICC 'vcgt' calibration tag
under X11 (and analogous to the API's in other operating systems).

> All of these are effectively the program specifying its transformation
> to what it hopes is linear in device space. The sample server happens
> to implement all three as global state, but that's an implementation
> detail. It would be straightforward to give each Xwayland client the
> illusion of complete control if we wanted.

For the purposes of setting the display global color calibration state,
then this is not desirable.

Cheers,
Graeme Gill.

Re: HDR support in Wayland/Weston

2019-03-04 Thread Chris Murphy
On Mon, Mar 4, 2019 at 3:20 AM Pekka Paalanen  wrote:
>
> X11 apps have (hopefully) done hardware LUT manipulation only through
> X11. There was no other way AFAIK, unless the app started essentially
> hacking the system via root privileges.
>
> The current DRM KMS kernel design has always had a "lock" (the DRM
> master concept) so that if one process (e.g. the X server) is in
> control of the display, then no other process can touch the display
> hardware at the same time.
>
> Before DRM KMS, the video mode programming was in the Xorg video
> drivers, so if an app wanted to bypass the X server, it would have had
> to poke the hardware "directly" bypassing any drivers, which is even
> more horrible that it might sound.

Sounds pretty horrible. Anyway, I'm definitely a fan of the answer to
the question found here: http://www.islinuxaboutchoice.com/

It sounds like legacy applications use XWayland, and in some edge case
request for a literal video hardware LUT, this could be some kind of
surface for that app's windows. That seems sane to me. A way to make
such computations almost free is important to them. I think they only
ever cared about doing it with a hardware LUT because it required no
CPU or GPU time. In really ancient times, display compensation (i.e.
do a transform from sRGB to mydisplayRGB to compensate for the fact my
display is not really an sRGB display) performance was variable. A few
companies figured out a way to do this really cheaply, even Apple had
a way to apply a non-LUT, lower quality profile, to do display
compensation with live Quicktime video, over 20 years ago. Meanwhile,
one of the arguments the Mozilla Firefox folks had for moving away
from lcms2 in favor of qcms was performance, but even that wasn't good
enough performance wise for always on display compensation. I still
don't know why, other than I recognize imaging pipelines are
complicated and really hard work.

Also in that era, before OS X, there were configurable transform
quality/performance settings: fast, good, best - or something like that. For
all I know, best is just as cheap these days as fast and you don't
need to distinguish such things. But if you did, I think historic
evidence shows only fast and best matter. Fast might have meant taking
a LUT display profile and describing the TRC with a gamma function
instead, or 4 bits per channel instead of 8, and 8 instead of 16.
These days I've heard there are hardware optimizations for floating
point that make it pointless to do integer as a performance saving
measure.
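As a toy illustration of that "fast" trade-off (my own sketch, not any particular CMM's method), approximating a LUT-based TRC with a single gamma exponent is a one-line least-squares fit in log-log space:

```python
import numpy as np

def fit_gamma(trc):
    """Approximate a sampled tone reproduction curve by one gamma value.

    Assumes trc[i] ~ (i/(n-1)) ** gamma; taking logs turns the power law
    into a line through the origin, so least squares gives
    gamma = sum(log x * log y) / sum(log x ** 2).
    """
    n = len(trc)
    x = np.linspace(0.0, 1.0, n)[1:-1]  # endpoints carry no information
    y = np.clip(np.asarray(trc, float)[1:-1], 1e-7, None)
    lx, ly = np.log(x), np.log(y)
    return float(np.sum(lx * ly) / np.sum(lx * lx))

# A 256-entry LUT sampled from an exact 2.2 power law fits back cleanly.
lut = np.linspace(0.0, 1.0, 256) ** 2.2
g = fit_gamma(lut)
```

Evaluating a gamma function is cheaper than a table lookup with interpolation was on old hardware, at the cost of accuracy for curves that are not really power laws.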

Back then we were really worried about getting the display the "correct 8
bits per channel" since that was the pipeline we had; any video
hardware LUT for calibration took away bits from that pipeline.

And that's gotten quite a lot easier these days because at the not
even super high end, there are commodity displays that are calibrated
internally and supply a minimum 8 bits per channel, often now 10 bits
per channel, pipeline. On those displays, we don't even worry about
calibration on the desktop. And that means the high end you get almost
for free from an application standpoint. The things to worry about are
shitty laptop displays which might have 8 bits per channel addressable
but aren't in any sense really giving us that much precision, it might
be 6 or 7. And there's a bunch of abstraction baked into the panel
that you have no control over that limits this further. So you kinda
have to be careful about doing something seemingly rudimentary like
changing its white point from D75 to D55. Hilariously, it can be the
crap displays that'll cause the most grief, not the high end use case.

OK so how do you deal with that? Well, it might in fact be that you
don't want to force accuracy, but rather re-render with a purpose to
make it look as decent as possible on that display even if the
transforms you're doing aren't achieving a colorimetrically accurate
result that you can measure, but do achieve a pleasing result. I know
it's funny - how the frak do we distinguish between these use cases?
And then what happens if you have what seems to be a mixed case of a
higher end self-calibrating display connected to a laptop with a
shitty display? Haha, yeah you can be fakaked almost no matter what
choices you make. That's my typical setup, by the way.


> > Even if it turns out the application tags its content with displayRGB,
> > thereby in effect getting a null transform, (or a null transform with
> > whatever quantization happens through 32bpc float intermediate color
> > image encoding), that's functionally a do not color manage deviceRGB
> > path.
>
> What is "displayRGB"? Does it essentially mean "write these pixel
> values to any monitor as is"? What if the channel value's data type
> does not match?

Good question. I use 'displayRGB' as generic shorthand for the display
profile, which is different on every system. On my system right now
it's /home/chris/.local/share/icc/edid-388f82e68786f1c5ac552f0b4d0c945f.icc
but it's 

Re: HDR support in Wayland/Weston

2019-03-04 Thread Chris Murphy
On Mon, Mar 4, 2019 at 1:32 AM Simon Ser  wrote:
>
> On Monday, March 4, 2019 8:13 AM, Graeme Gill  wrote:
> > And the current favorite is "blue light filter" effects, for which numerous
> > applications are currently available. They tweak the white point
> > of the display by arbitrarily modifying the hardware per channel LUTs.
> > (i.e. f.lux, Redshift, SunsetScreen, Iris, Night Shift, Twilight etc.)
> >
> > Such applications have their place for those who like the effect, but
> > ideally such usage would not simply blow color management away.
>
> FWIW wlroots has a protocol for this [1]. GNOME and KDE have this feature
> directly integrated in their compositor.

Another interesting use case: Flatpaks. Say I have a Flatpak
application which depends on KDE, so I have the org.kde.Platform runtime
installed; how does an application that wants to alter the display white
point work in such a case? It'll presumably go through the KDE runtime
to try to do this, but then how does that get communicated when this
is actually running on GNOME? And then how does it properly get reset
once that application quits?

I don't really need a literal explanation. I'm fine with simplistic
trust-based explanations.

> Is there a "good way" to combine multiple LUTs?

Whether 1D, 2D, or 3D LUTs, they can be concatenated, yes. I'll let Graeme
answer the math aspect of it; he probably knows it off the top of his
head. I don't. But yes it's possible, and yes it's really easy to do
it wrong, in particular when there are white point differences. It's
likely also more complicated in an HDR context. So yeah, you'll
probably want an API for it.
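For the 1D case, concatenation amounts to function composition with a resample, which is also exactly where precision loss creeps in. A minimal sketch (pure illustration, not any compositor's actual API; all names are invented):

```python
def apply_lut(lut, x):
    """Look up x in [0, 1] with linear interpolation between entries."""
    n = len(lut) - 1
    pos = min(max(x, 0.0), 1.0) * n
    i = int(pos)
    if i >= n:
        return lut[n]
    frac = pos - i
    return lut[i] * (1.0 - frac) + lut[i + 1] * frac

def concat_luts(lut_a, lut_b, size=256):
    """Compose two 1D LUTs: result(x) = lut_b(lut_a(x)).

    Resampling through interpolation loses precision, one reason naive
    concatenation goes wrong -- especially when the two curves were
    built around different white points.
    """
    return [apply_lut(lut_b, apply_lut(lut_a, i / (size - 1)))
            for i in range(size)]

# A gamma-2.2 curve concatenated with its inverse is approximately
# (not exactly!) the identity -- the residual is the resampling error.
N = 256
gamma = [(i / (N - 1)) ** 2.2 for i in range(N)]
inv = [(i / (N - 1)) ** (1 / 2.2) for i in range(N)]
ident = concat_luts(gamma, inv)
```

Note how even this trivial round trip is only approximately the identity; with real per-device curves and differing white points, the error is structured rather than just noise.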

Another use case for this would be simulating different standard
observers, including dichromacy and anomalous trichromacy (of which
there are several each but only a couple are particularly common).
When designing user interfaces, or signage, or advertising, it's a
really good idea to check if some substantial number of users are
going to get hit with confusion because they can't distinguish
elements in a design. This is normally implemented in the application
rather than in the OS, but macOS does have an option for this.

And now that I think of it, I'm not sure they're even using 'space'
class ICC profiles, but rather 'abstract' class. You can think of the
'abstract' class as an analog to 'device link': where a device link
implies a direct deviceA>deviceB, or even a direct deviceA>deviceC
(the A>B>C concatenation is precomputed when the device link is
built), the abstract profile works in non-device color spaces such as
L*a*b* or XYZ. You can think of them as a kind of effects filter.

And for that matter, the "blue light filter" can be described with an
abstract profile, or even two of them to define the endpoints of a
continuum, with something like a slider that effectively concatenates
them with a variable coefficient to weight them however the user wants.
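As a toy sketch of that variable weighting between two endpoints (the endpoint chromaticities below are made up for illustration, not any real preset):

```python
def blend_white_points(wp_neutral, wp_warm, t):
    """Linearly interpolate between two endpoint white points.

    t = 0.0 gives the unmodified display, t = 1.0 the full 'night'
    effect; the slider just varies t. Values are xy chromaticities,
    chosen here purely for illustration.
    """
    return tuple((1.0 - t) * a + t * b for a, b in zip(wp_neutral, wp_warm))

d65 = (0.3127, 0.3290)    # daylight white
warm = (0.4600, 0.4100)   # a warm, low-CCT target (made-up endpoint)
halfway = blend_white_points(d65, warm, 0.5)
```

In a real implementation the blend would happen between the two abstract-profile transforms rather than between raw chromaticities, but the slider-coefficient idea is the same.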

These things don't necessarily need to be literal ICC profiles as a
file on disk if you don't want. The non-LUT display class (primaries
plus tone curve, which can be defined as a gamma function or even
parametrically), the space class, and the abstract class are typically
pretty tiny; they could be created on the fly and fed into lcms2 as
virtual profiles, letting it handle all the transforms with its well
tested math. The named color, device link, output, and input classes
can be quite a bit larger, and even massive in some cases, but off
hand I don't see a use case for them here.

-- 
Chris Murphy
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/wayland-devel

Re: HDR support in Wayland/Weston

2019-03-04 Thread Chris Murphy
On Mon, Mar 4, 2019 at 12:13 AM Graeme Gill  wrote:
>
> Chris Murphy wrote:
>
> > A common offender were games. They'd try to access the video card LUT
> > directly for effects, but then not reset it back to what it was,
> > rather reset it to a hardwired assumption the game makes,
>
> And the current favorite is "blue light filter" effects, for which numerous
> applications are currently available. They tweak the white point
> of the display by arbitrarily modifying the hardware per channel LUTs.
> (i.e. f.lux, Redshift, SunsetScreen, Iris, Night Shift, Twilight etc.)

It's a really good example. I have no idea how they work.  GNOME has
one built-in called "Night Light" and I use it on Wayland. So if
Wayland no longer supports applications altering video hardware LUTs,
then I'm not sure how it does it for every pixel from every
application, both Wayland and XWayland.


> Such applications have their place for those who like the effect, but
> ideally such usage would not simply blow color management away.

I've never been suspicious upon reset but I admit I've never measured
a before and after.


> In order of desirability it would be nice to:
>
> 3) Have the hardware Luts restored after an application that uses them
>    exits (i.e. like OS X handles it).
>
> 2) Implement virtual per channel LUTs, with the compositor combining them
>    together in some way, and have some means of the color management
>    applications being aware when the display is being interfered with by
>    another application, so that the user can be warned that the color
>    management state is invalid.
>
> 1) A color managed API that lets an application shift the display white point
>    using chromatic adaptation, so that such blue light filter applications
>    can operate more predictably, as well as some means of the color management
>    applications being aware of when this is happening.
>

Yep this could be done with the ICC 'color space' class very cheaply,
built on the fly even. And applied in the intermediate color space so
it affects every pixel for every display regardless of source.

As for notification, I think it's more practical and useful that we'd
find this in a system UI element, like settings, related to the
display profile and calibration state. Getting messages between
applications is harder; there isn't universal agreement on using dbus,
or even on asking colord what the state is. But the system settings,
devices > displays > color panel, could certainly put up a ! in a red
triangle, and if you click it or hover you get a notice that the
display state is not calibrated, or some such. It may in fact still be
calibrated, but chromatically adapted with a different white point in
mind, but I think most users will better grok "not calibrated" than
some more technically accurate description.


-- 
Chris Murphy

Re: HDR support in Wayland/Weston

2019-03-04 Thread Simon Ser
On Monday, March 4, 2019 10:50 AM, Carsten Haitzler wrote:
> > What should this look like? Disclaimer: I have no idea how these applications
> > work and I know nothing about color management.
> > I'm guessing this is a restriction of the "change the whole LUTs" API. Are
> > there any features the "blue light filter" app won't be able to implement
> > when switching to this API? Would the compositor part become complicated
> > (judging from [2] it seems different "blue light filter" apps may compute
> > LUTs differently)?
> > Since many compositors (GNOME, KDE, wlroots, maybe more) implement a way to
> > apply a "blue light filter", I think it's important to be able to notify
> > color management applications that they don't have exclusive access. Or
> > maybe this should just be handled internally by the compositor? (Display a
> > warning or something?)
>
> apps should not have exclusive access. we're re-doing the whole horrid
> "install colormap" thing from the x days of 256 color (or
> paletted/colormapped displays).

Yes, of course. I should've said "that the compositor won't display
accurate colors".

Re: HDR support in Wayland/Weston

2019-03-04 Thread Kai-Uwe
Am 04.03.19 um 09:32 schrieb Simon Ser:
> On Monday, March 4, 2019 8:13 AM, Graeme Gill  wrote:
>> And the current favorite is "blue light filter" effects, for which numerous
>> applications are currently available. They tweak the white point
>> of the display by arbitrarily modifying the hardware per channel LUTs.
>> (i.e. f.lux, Redshift, SunsetScreen, Iris, Night Shift, Twilight etc.)
>>
>> Such applications have their place for those who like the effect, but
>> ideally such usage would not simply blow color management away.
> FWIW wlroots has a protocol for this [1]. GNOME and KDE have this feature
> directly integrated in their compositor.

These tools work on the assumption that all device RGB is the same (sRGB).
That is, to put it mildly, not ideal.

The shortcomings of ignoring the output color space become evident
when one does multi-monitor white point adjustment. A BT.2020 and a
Rec.709 or DCI-P3 display will disagree widely on a red light
modification if the 1D LUT is manipulated without considering the
device specifics. The correct way is:

* convert R+G+B ramps to CIE XYZ (usually via an ICC profile)
* do chromatic adaptation in CIE XYZ, monitor white point -> common
destination white point (D50, 2800 K, ...)
* convert back to per-output device RGB [CIE XYZ -> deviceRGB]
* merge with the ICC profile VCGT calibration ramp, and finally
* apply the 1D LUT on the correct output
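The chromatic adaptation step in the list above is commonly done with the Bradford transform; a self-contained sketch of just that step (the per-output RGB<->XYZ conversions via the ICC profile are omitted, and the matrix values are the standard published Bradford coefficients):

```python
# Bradford cone-response matrix and its (published) inverse.
M_BFD = [[ 0.8951,  0.2664, -0.1614],
         [-0.7502,  1.7135,  0.0367],
         [ 0.0389, -0.0685,  1.0296]]
M_BFD_INV = [[ 0.9869929, -0.1470543,  0.1599627],
             [ 0.4323053,  0.5183603,  0.0492912],
             [-0.0085287,  0.0400428,  0.9684867]]

def matvec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def adapt_xyz(xyz, src_white, dst_white):
    """Bradford chromatic adaptation of one CIE XYZ value.

    Scale the cone responses by the ratio of destination to source
    white, then transform back to XYZ.
    """
    cone = matvec(M_BFD, xyz)
    cone_s = matvec(M_BFD, src_white)
    cone_d = matvec(M_BFD, dst_white)
    scaled = [cone[i] * cone_d[i] / cone_s[i] for i in range(3)]
    return matvec(M_BFD_INV, scaled)

D65 = [0.95047, 1.00000, 1.08883]
D50 = [0.96422, 1.00000, 0.82521]
# Adapting the source white itself must land on the destination white.
white_out = adapt_xyz(D65, D65, D50)
```

Because the adaptation runs in XYZ, each output applies it through its own device profile, which is exactly why the result stays consistent across a BT.2020 and a Rec.709 monitor while a device-blind 1D LUT tweak does not.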

All the above tools ignore the output profile and apply the same
device-unspecific manipulation to all outputs, assuming sRGB.
(Assuming sRGB device primaries performs pretty badly in the face of
BT.2020 / HDR.)

>> In order of desirability it would be nice to:
>>
>> 3) Have the hardware Luts restored after an application that uses them
>>    exits (i.e. like OS X handles it).
> Agreed. This is done in our protocol and there's no such issue when builtin in
> the compositor.
>
>> 2) Implement virtual per channel LUTs, with the compositor combining them
>>    together in some way, and have some means of the color management
>>    applications being aware when the display is being interfered with by
>>    another application, so that the user can be warned that the color
>>    management state is invalid.
> Is there a "good way" to combine multiple LUTs?

You mean a 1D LUT? Yes, in CIE XYZ or another linear blending space. See above.

>> 1) A color managed API that lets an application shift the display white point
>>    using chromatic adaptation, so that such blue light filter applications
>>    can operate more predictably, as well as some means of the color
>>    management applications being aware of when this is happening.
> What should this look like? Disclaimer: I have no idea how these applications
> work and I know nothing about color management.
>
> I'm guessing this is a restriction of the "change the whole LUTs" API. Are
> there any features the "blue light filter" app won't be able to implement when
> switching to this API? Would the compositor part become complicated (judging
> from [2] it seems different "blue light filter" apps may compute LUTs
> differently)?

White point adaptation has been part of the ICC spec for a very long time.

> Since many compositors (GNOME, KDE, wlroots, maybe more) implement a way to
> apply a "blue light filter", I think it's important to be able to notify color
> management applications that they don't have exclusive access. Or maybe this
> should just be handled internally by the compositor? (Display a warning or
> something?)
A color management front end can take care of such things, but only if
it can know that a manipulation is in action. Be that manipulation
visual impairment simulation, white point altering, saturation altering,
a sepia effect, you name it.
> [1]: https://github.com/swaywm/wlr-protocols/blob/master/unstable/wlr-gamma-control-unstable-v1.xml
> [2]: https://github.com/jonls/redshift/blob/master/README-colorramp

Neither link talks about device-specific LUT manipulation. They are
color management unaware, and the outcome will vary from device to device.

That said, blue light reduction is a desirable feature. I enjoy it each
day/night using ICC style color management.

Kai-Uwe Behrmann
-- 
www.oyranos.org


Re: HDR support in Wayland/Weston

2019-03-04 Thread Graeme Gill
Chris Murphy wrote:

Hi Chris,

> Well you need a client to do display calibration which necessarily
> means altering the video LUT (to linear) in order to do the
> measurements from which a correction curve is computed, and then that
> client needs to install that curve into the video LUT. Now, colord
> clearly has such capability, as it's applying vcgt tags in ICC
> profiles now. If colord can do it, then what prevents other clients
> from doing it?

my suggestion is not to make the profiling application deal in
this sort of nuts and bolts. If there is an API to install
a profile for a particular output, then the Compositor can
take responsibility for implementing the ICC 'vcgt' tag.
It can choose to implement it in hardware (CRTC), or any other
way it wants, as long as it is implemented so that it
doesn't disadvantage the result compared to implementing it in hardware.

Cheers,
Graeme.

Re: HDR support in Wayland/Weston

2019-03-04 Thread Pekka Paalanen
On Fri, 1 Mar 2019 12:47:05 -0700
Chris Murphy  wrote:

> On Fri, Mar 1, 2019 at 3:10 AM Pekka Paalanen  wrote:
> >
> > On Thu, 28 Feb 2019 18:28:33 -0700
> > Chris Murphy  wrote:
> >  
> > > I'm curious how legacy applications including games used to manipulate
> > > actual hardware LUT in a video card, if the application asked the
> > > client to do it, in which case it still could do that?  
> >
> > Hi Chris,
> >
> > right now, in no case.  
> 
> I made a typo.
> s/client/kernel
> 
> Or has LUT manipulation only ever been done via X11?

Hi Chris,

X11 apps have (hopefully) done hardware LUT manipulation only through
X11. There was no other way AFAIK, unless the app started essentially
hacking the system via root privileges.

The current DRM KMS kernel design has always had a "lock" (the DRM
master concept) so that if one process (e.g. the X server) is in
control of the display, then no other process can touch the display
hardware at the same time.

Before DRM KMS, the video mode programming was in the Xorg video
drivers, so if an app wanted to bypass the X server, it would have had
to poke the hardware "directly", bypassing any drivers, which is even
more horrible than it might sound.

Non-X11 apps, such as those using fbdev, svgalib, DirectFB, etc., would
be something different that I'm not too familiar with. Fbdev was a
standard'ish kernel ABI, while the rest more or less poked the hardware
directly, bypassing any kernel drivers, if not using fbdev under the
hood. These I would just ignore; they were running without a window
system to begin with.


> > > a. I've already done color management, I *really do* need deviceRGB
> > > b. display this, its color space is _.  
> >
> > Case b) is already in both of the protocol proposals.
> >
> > Case a) is in Niels' proposal, but I raised some issues with that. It is
> > a very problematic case to implement in general, too, because the
> > compositor is in some cases very likely to have to undo the color
> > transformations the application already did to go back to a common
> > blending space or to the spaces for other outputs.  
> 
> Case a) is a subset of the calibration/characterization application's
> requirement.
> 
> Even if it turns out the application tags its content with displayRGB,
> thereby in effect getting a null transform, (or a null transform with
> whatever quantization happens through 32bpc float intermediate color
> image encoding), that's functionally a do not color manage deviceRGB
> path.

What is "displayRGB"? Does it essentially mean "write these pixel
values to any monitor as is"? What if the channel value's data type
does not match?

I suppose if a compositor receives content with "displayRGB" profile,
assuming my guess above is correct, it would have to apply the inverse
of the blending space to output space transform first, so that the
total result would be a null transform for pixels that were not blended
with anything else.

> > > Both types of applications exist. It might very well be reasonable to
> > > say, yeah we're not going to support use case a.) Such smarter
> > > applications are going to have to do their color management however
> > > they want internally, and transform to a normalized color space like
> > > P3 or Rec.2020 or opRGB and follow use case b.) where they tag all
> > > content with that normalized color space.  
> >
> > Right. We'll see. And measurement/calibration/characterisation
> > applications are a third category completely different to the two
> > above, by window management requirements if nothing else.  
> 
> It is fair to keep track of, and distinguish a display path with:
> 
> 1. no calibration and no/null transform;
> 2. calibration applied, but no/null transform;
> 3. calibration and transform applied.
> 
> The calibration application does need a means of ensuring explicitly
> getting each of those. 1, is needed to figure out the uncorrected
> state and hopefully give the user some guidance on knob settings via
> OSD, and then to take measurements to compute a corrective curve
> typically going in the video card LUT or equivalent wherever else that
> would go; 2, is needed to build an ICC profile for the display; and 3,
> is needed for verifying the path.
> 
> An application doing color management internally only really needs
> path 2. Nevertheless, that app's need is a subset of what's already
> needed by an application that does display calibration and profiling.

Yes, this is very much why I would prefer
measurement/calibration/characterization applications to use another
protocol extension that is explicitly designed for these needs instead
of or in addition to a generic "color management" extension that is
designed for all apps for general content delivery purposes.

I presume the measurement or calibration use case always involves
"owning" the whole monitor, and the very specific monitor at that. That
is, making the monitor temporarily exclusive to the app, so that
nothing else can interfere 

Re: HDR support in Wayland/Weston

2019-03-04 Thread The Rasterman
On Mon, 04 Mar 2019 08:32:45 + Simon Ser  said:

> On Monday, March 4, 2019 8:13 AM, Graeme Gill  wrote:
> > And the current favorite is "blue light filter" effects, for which numerous
> > applications are currently available. They tweak the white point
> > of the display by arbitrarily modifying the hardware per channel LUTs.
> > (i.e. f.lux, Redshift, SunsetScreen, Iris, Night Shift, Twilight etc.)
> >
> > Such applications have their place for those who like the effect, but
> > ideally such usage would not simply blow color management away.
> 
> FWIW wlroots has a protocol for this [1]. GNOME and KDE have this feature
> directly integrated in their compositor.
> 
> > In order of desirability it would be nice to:
> >
> > 3) Have the hardware Luts restored after an application that uses them
> >    exits (i.e. like OS X handles it).
> 
> Agreed. This is done in our protocol and there's no such issue when builtin in
> the compositor.
> 
> > 2) Implement virtual per channel LUTs, with the compositor combining them
> >together in some way, and have some means of the color management
> > applications being aware when the display is being interfered with by
> > another application, so that the user can be warned that the color
> > management state is invalid.
> 
> Is there a "good way" to combine multiple LUTs?
> 
> > 1) A color managed API that lets an application shift the display white
> > point using chromatic adaptation, so that such blue light filter
> > applications can operate more predictably, as well as some means of the
> > color management applications being aware of when this is happening.
> 
> What should this look like? Disclaimer: I have no idea how these applications
> work and I know nothing about color management.
> 
> I'm guessing this is a restriction of the "change the whole LUTs" API. Are
> there any features the "blue light filter" app won't be able to implement when
> switching to this API? Would the compositor part become complicated (judging
> from [2] it seems different "blue light filter" apps may compute LUTs
> differently)?
> 
> Since many compositors (GNOME, KDE, wlroots, maybe more) implement a way to
> apply a "blue light filter", I think it's important to be able to notify color
> management applications that they don't have exclusive access. Or maybe this
> should just be handled internally by the compositor? (Display a warning or
> something?)

apps should not have exclusive access. we're re-doing the whole horrid "install
colormap" thing from the x days of 256 color (or paletted/colormapped displays).

> Thanks,
> 
> [1]:
> https://github.com/swaywm/wlr-protocols/blob/master/unstable/wlr-gamma-control-unstable-v1.xml
> [2]: https://github.com/jonls/redshift/blob/master/README-colorramp
> 
> --
> Simon Ser
> https://emersion.fr
> 

-- 
- Codito, ergo sum - "I code, therefore I am" --
Carsten Haitzler - ras...@rasterman.com


Re: HDR support in Wayland/Weston

2019-03-04 Thread Simon Ser
On Monday, March 4, 2019 8:13 AM, Graeme Gill  wrote:
> And the current favorite is "blue light filter" effects, for which numerous
> applications are currently available. They tweak the white point
> of the display by arbitrarily modifying the hardware per channel LUTs.
> (i.e. f.lux, Redshift, SunsetScreen, Iris, Night Shift, Twilight etc.)
>
> Such applications have their place for those who like the effect, but
> ideally such usage would not simply blow color management away.

FWIW wlroots has a protocol for this [1]. GNOME and KDE have this feature
directly integrated in their compositor.

> In order of desirability it would be nice to:
>
> 3) Have the hardware Luts restored after an application that uses them
>    exits (i.e. like OS X handles it).

Agreed. This is done in our protocol, and there's no such issue when built into
the compositor.

> 2) Implement virtual per channel LUTs, with the compositor combining them
>    together in some way, and have some means of the color management
>    applications being aware when the display is being interfered with by
>    another application, so that the user can be warned that the color
>    management state is invalid.

Is there a "good way" to combine multiple LUTs?

> 1) A color managed API that lets an application shift the display white point
>    using chromatic adaptation, so that such blue light filter applications
>    can operate more predictably, as well as some means of the color management
>    applications being aware of when this is happening.

What should this look like? Disclaimer: I have no idea how these applications
work and I know nothing about color management.

I'm guessing this is a restriction of the "change the whole LUTs" API. Are there
any features the "blue light filter" app won't be able to implement when
switching to this API? Would the compositor part become complicated (judging
from [2] it seems different "blue light filter" apps may compute LUTs
differently)?

Since many compositors (GNOME, KDE, wlroots, maybe more) implement a way to
apply a "blue light filter", I think it's important to be able to notify color
management applications that they don't have exclusive access. Or maybe this
should just be handled internally by the compositor? (Display a warning or
something?)

Thanks,

[1]: https://github.com/swaywm/wlr-protocols/blob/master/unstable/wlr-gamma-control-unstable-v1.xml
[2]: https://github.com/jonls/redshift/blob/master/README-colorramp

--
Simon Ser
https://emersion.fr


Re: HDR support in Wayland/Weston

2019-03-03 Thread Graeme Gill
Chris Murphy wrote:

> A common offender were games. They'd try to access the video card LUT
> directly for effects, but then not reset it back to what it was,
> rather reset it to a hardwired assumption the game makes,

And the current favorite is "blue light filter" effects, for which numerous
applications are currently available. They tweak the white point
of the display by arbitrarily modifying the hardware per channel LUTs.
(i.e. f.lux, Redshift, SunsetScreen, Iris, Night Shift, Twilight etc.)

Such applications have their place for those who like the effect, but
ideally such usage would not simply blow color management away.

In order of desirability it would be nice to:

3) Have the hardware Luts restored after an application that uses them
   exits (i.e. like OS X handles it).

2) Implement virtual per channel LUTs, with the compositor combining them
   together in some way, and have some means of the color management
   applications being aware when the display is being interfered with by
   another application, so that the user can be warned that the color
   management state is invalid.

1) A color managed API that lets an application shift the display white point
   using chromatic adaptation, so that such blue light filter applications
   can operate more predictably, as well as some means of the color management
   applications being aware of when this is happening.

Cheers,
Graeme Gill.

Re: HDR support in Wayland/Weston

2019-03-01 Thread Adam Jackson
On Fri, 2019-03-01 at 12:10 +0200, Pekka Paalanen wrote:
> On Thu, 28 Feb 2019 18:28:33 -0700
> Chris Murphy  wrote:
> 
> > I'm curious how legacy applications including games used to manipulate
> > actual hardware LUT in a video card, if the application asked the
> > client to do it, in which case it still could do that?
> 
> Hi Chris,
> 
> right now, in no case.
> 
> It would probably be possible to enhance Xwayland (the X server) to
> implement whatever X11 offers as protocol for LUT manipulation, and
> forward that to a Wayland display server (compositor) using a new
> Wayland protocol extension where Xwayland attached that LUT to all
> Xwayland wl_surfaces. Then the Wayland compositor would apply the LUT
> per wl_surface.

X kinda has three mechanisms for this. The first one, that nobody
really uses, is setting the colormap for a DirectColor visual. The
second, which games typically use, is setting per-channel gamma
(implicitly for the whole screen) as single floating-point values with
the xf86vidmode extension. The third, which desktop environments
sometimes use to try to make distinct displays look similar, is setting
per-crtc gamma as 256 (or whatever) stops per channel with the RANDR
extension.
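The gamma mechanisms described above all reduce to the client handing the server a per-channel ramp. A rough sketch of how such a ramp is derived from a single gamma value, xf86vidmode-style (the 256-entry size follows the "256 (or whatever) stops" above, and the 16-bit range is illustrative; real hardware ramp sizes vary):

```python
def gamma_ramp(gamma, size=256):
    """Build one channel's ramp of 16-bit stops from a single gamma
    value. gamma = 1.0 yields the identity ramp; higher gamma lifts
    the mid-tones, which is what games typically used this for.
    """
    return [round(((i / (size - 1)) ** (1.0 / gamma)) * 65535)
            for i in range(size)]

ramp = gamma_ramp(1.0)   # identity ramp
lifted = gamma_ramp(2.2)  # brightened mid-tones
```

The server would apply one such ramp per channel to every pixel on the CRTC, which is why two clients setting ramps behind each other's backs (the "didn't reset it on exit" problem) leave the display in an arbitrary state.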

All of these are effectively the program specifying its transformation
to what it hopes is linear in device space. The sample server happens
to implement all three as global state, but that's an implementation
detail. It would be straightforward to give each Xwayland client the
illusion of complete control if we wanted.

- ajax


Re: HDR support in Wayland/Weston

2019-03-01 Thread Chris Murphy
On Fri, Mar 1, 2019 at 3:10 AM Pekka Paalanen  wrote:
>
> On Thu, 28 Feb 2019 18:28:33 -0700
> Chris Murphy  wrote:
>
> > I'm curious how legacy applications including games used to manipulate
> > actual hardware LUT in a video card, if the application asked the
> > client to do it, in which case it still could do that?
>
> Hi Chris,
>
> right now, in no case.

I made a typo.
s/client/kernel

Or has LUT manipulation only ever been done via X11?


> The approach already suggested from my side and other people, that is
> encoded in both of the two protocol proposals, is that the client
> describes the image content, which allows the compositor to do whatever
> the correct transformation is to each display. The client does not need
> to do anything per-output (unless it very much wants to). This enables
> not just what you mentioned, but also showing the same client pixel
> correctly on multiple different outputs at the same time.

OK super.


> > a. I've already done color management, I *really do* need deviceRGB
> > b. display this, its color space is _.
>
> Case b) is already in both of the protocol proposals.
>
> Case a) is in Niels' proposal, but I raised some issues with that. It is
> a very problematic case to implement in general, too, because the
> compositor is in some cases very likely to have to undo the color
> transformations the application already did to go back to a common
> blending space or to the spaces for other outputs.

Case a) is a subset of the calibration/characterization application's
requirement.

Even if it turns out the application tags its content with displayRGB,
thereby in effect getting a null transform, (or a null transform with
whatever quantization happens through 32bpc float intermediate color
image encoding), that's functionally a do not color manage deviceRGB
path.

> > Both types of applications exist. It might very well be reasonable to
> > say, yeah we're not going to support use case a.) Such smarter
> > applications are going to have to do their color management however
> > they want internally, and transform to a normalized color space like
> > P3 or Rec.2020 or opRGB and follow use case b.) where they tag all
> > content with that normalized color space.
>
> Right. We'll see. And measurement/calibration/characterisation
> applications are a third category completely different to the two
> above, by window management requirements if nothing else.

It is fair to keep track of, and distinguish a display path with:

1. no calibration and no/null transform;
2. calibration applied, but no/null transform;
3. calibration and transform applied.

The calibration application does need a means of ensuring explicitly
getting each of those. 1, is needed to figure out the uncorrected
state and hopefully give the user some guidance on knob settings via
OSD, and then to take meausurements to compute a corrective curve
typically going in the video card LUT or equivalent wherever else that
would go; 2, is needed to build an ICC profile for the display; and 3,
is needed for verifying the path.

An application doing color management internally only really needs
path 2. Nevertheless, that app's need is a subset of what's already
needed by an application that does display calibration and profiling.

-- 
Chris Murphy

Re: HDR support in Wayland/Weston

2019-03-01 Thread Pekka Paalanen
On Thu, 28 Feb 2019 18:28:33 -0700
Chris Murphy  wrote:

> On Thu, Feb 28, 2019 at 2:35 AM Pekka Paalanen  wrote:
> >
> > On Wed, 27 Feb 2019 13:47:07 -0700
> > Chris Murphy  wrote:
> >  
> > > On Wed, Feb 27, 2019 at 5:27 AM Pekka Paalanen wrote:
> > > >
> > > > there is a single, unambiguous answer on Wayland: the compositor owns
> > > > the pipeline. Therefore we won't have the kind of problems you describe
> > > > above.
> > > >
> > > > These are the very reasons I am against adding any kind of protocol
> > > > extension that would allow a client to directly touch the pipeline or
> > > > to bypass the compositor.  
> > >
> > > Well you need a client to do display calibration which necessarily
> > > means altering the video LUT (to linear) in order to do the
> > > measurements from which a correction curve is computed, and then that
> > > client needs to install that curve into the video LUT. Now, colord
> > > clearly has such capability, as it's applying vcgt tags in ICC
> > > profiles now. If colord can do it, then what prevents other clients
> > > from doing it?  
> >
> > Hi Chris,
> >
> > there is no need to expose hardware knobs like LUT etc. directly in
> > protocol even for measuring. We can have a special, privileged protocol
> > extension for measurement apps, where the measuring intent is explicit,
> > and the compositor can prepare the hardware correctly. This also avoids
> > updating the measurement apps to follow the latest hardware features
> > which the compositor might be using already. An old measurement app
> could be getting a wrong result because it didn't know how to reset a new
> > part in the pipeline that the compositor is using.
> >
> > Hence the compositor owns the pipeline at all times.
> >
> > Permanently setting the new pipeline parameters is compositor
> > configuration - you wouldn't want to have to run the measurement app on
> > every boot to just install the right parameters. Compositor
> > configuration is again compositor specific. The privileged protocol
> > extension could have a way to deliver the new output color profile to
> > the compositor, for the compositor to save and apply it with any
> > methods it happens to use. You cannot assume that the compositor will
> > actually program "the hardware LUT" to achieve that, since there are
> > many other ways to achieve the same and hardware capabilities vary.
> > Nowadays there is often much more than just one LUT. Furthermore, in my
> > recent reply to Niels' color management extension proposal I derived a
> > case from the proposal where a compositor would be forced to use an
> > identity LUT and instead do all transformation during rendering.
> >
> > Colord is not a client. Colord is currently called from a
> > weston plugin, the plugin has access to the compositor internal API to
> > set up a LUT. Colord cannot do it on its own.  
> 
> That all seems reasonable.
> 
> I'm curious how legacy applications including games used to manipulate
> actual hardware LUT in a video card, if the application asked the
> client to do it, in which case it still could do that?

Hi Chris,

right now, in no case.

It would probably be possible to enhance Xwayland (the X server) to
implement whatever X11 offers as protocol for LUT manipulation, and
forward that to a Wayland display server (compositor) using a new
Wayland protocol extension where Xwayland attached that LUT to all
Xwayland wl_surfaces. Then the Wayland compositor would apply the LUT
per wl_surface.

This would have nothing to do with the color management extensions now
being under discussion, unless Xwayland grew the capability of
manufacturing an ICC profile out of that game-given LUT. I suspect that
could be inconvenient.

> Also I'm curious about the multiple display use case. I think it's
> quite a lot to ask a client to know about multiple transforms for
> multiple displays and which pixels to transform and how, based on
> which display those pixels are currently displayed on, and then
> somehow to intelligently cache this so it doesn't bog down the whole
> system. A use case in particular I'm thinking of is Firefox, where you
> really don't want to have to constantly do transforms of everything,
> every time a pixel scrolls away and vanishes, but when it reappears
> it's reappearing on a different display. And also you'd want the pixel
> to look correct from the very instant it appears on a different
> display and is correct upon appearing back on the original display or
> even looks correct in a split/mirror window scenario, laptop display +
> projector.

The approach already suggested from my side and other people, that is
encoded in both of the two protocol proposals, is that the client
describes the image content, which allows the compositor to do whatever
the correct transformation is to each display. The client does not need
to do anything per-output (unless it very much wants to). This enables
not just what you mentioned, but also showing the 

Re: HDR support in Wayland/Weston

2019-02-28 Thread Chris Murphy
On Thu, Feb 28, 2019 at 2:35 AM Pekka Paalanen  wrote:
>
> On Wed, 27 Feb 2019 13:47:07 -0700
> Chris Murphy  wrote:
>
> > On Wed, Feb 27, 2019 at 5:27 AM Pekka Paalanen  wrote:
> > >
> > > there is a single, unambiguous answer on Wayland: the compositor owns
> > > the pipeline. Therefore we won't have the kind of problems you describe
> > > above.
> > >
> > > These are the very reasons I am against adding any kind of protocol
> > > extension that would allow a client to directly touch the pipeline or
> > > to bypass the compositor.
> >
> > Well you need a client to do display calibration which necessarily
> > means altering the video LUT (to linear) in order to do the
> > measurements from which a correction curve is computed, and then that
> > client needs to install that curve into the video LUT. Now, colord
> > clearly has such capability, as it's applying vcgt tags in ICC
> > profiles now. If colord can do it, then what prevents other clients
> > from doing it?
>
> Hi Chris,
>
> there is no need to expose hardware knobs like LUT etc. directly in
> protocol even for measuring. We can have a special, privileged protocol
> extension for measurement apps, where the measuring intent is explicit,
> and the compositor can prepare the hardware correctly. This also avoids
> updating the measurement apps to follow the latest hardware features
> which the compositor might be using already. An old measurement app
> could be getting a wrong result because it didn't know how to reset a new
> part in the pipeline that the compositor is using.
>
> Hence the compositor owns the pipeline at all times.
>
> Permanently setting the new pipeline parameters is compositor
> configuration - you wouldn't want to have to run the measurement app on
> every boot to just install the right parameters. Compositor
> configuration is again compositor specific. The privileged protocol
> extension could have a way to deliver the new output color profile to
> the compositor, for the compositor to save and apply it with any
> methods it happens to use. You cannot assume that the compositor will
> actually program "the hardware LUT" to achieve that, since there are
> many other ways to achieve the same and hardware capabilities vary.
> Nowadays there is often much more than just one LUT. Furthermore, in my
> recent reply to Niels' color management extension proposal I derived a
> case from the proposal where a compositor would be forced to use an
> identity LUT and instead do all transformation during rendering.
>
> Colord is not a client. Colord is currently called from a
> weston plugin, the plugin has access to the compositor internal API to
> set up a LUT. Colord cannot do it on its own.

That all seems reasonable.

I'm curious about legacy applications, including games, that used to
manipulate the actual hardware LUT in a video card. If such an
application asked the compositor to do it, could it still do that?

Also I'm curious about the multiple display use case. I think it's
quite a lot to ask a client to know about multiple transforms for
multiple displays and which pixels to transform and how, based on
which display those pixels are currently displayed on, and then
somehow to intelligently cache this so it doesn't bog down the whole
system. A use case in particular I'm thinking of is Firefox, where you
really don't want to have to constantly do transforms of everything,
every time a pixel scrolls away and vanishes, but when it reappears
it's reappearing on a different display. And also you'd want the pixel
to look correct from the very instant it appears on a different
display and is correct upon appearing back on the original display or
even looks correct in a split/mirror window scenario, laptop display +
projector.

At least on macOS and for the most part Windows, most applications
aren't color management aware, and just assume deviceRGB color for
everything; and at least on macOS by default, and Windows as an
option, it's possible for the window manager to substitute what is
really "legacy deviceRGB" for sRGB as an intermediate space and from
there properly do display compensation for pixels on whatever display
they appear on. Ergo, a display calibration app does need a way to
announce its ability so that its test chart isn't being assumed to be
sRGB (or whatever), and a smarter color managed application needs a
way of saying one of two things:
a. I've already done color management, I *really do* need deviceRGB
b. display this, its color space is _.

Both types of applications exist. It might very well be reasonable to
say, yeah we're not going to support use case a.) Such smarter
applications are going to have to do their color management however
they want internally, and transform to a normalized color space like
P3 or Rec.2020 or opRGB and follow use case b.) where they tag all
content with that normalized color space.

And all of this has an equivalent path and transform for printing, and
how to get sane output whether 

Re: HDR support in Wayland/Weston

2019-02-28 Thread Adam Jackson
On Wed, 2019-02-27 at 13:47 -0700, Chris Murphy wrote:
> On Wed, Feb 27, 2019 at 5:27 AM Pekka Paalanen  wrote:
> > there is a single, unambiguous answer on Wayland: the compositor owns
> > the pipeline. Therefore we won't have the kind of problems you describe
> > above.
> > 
> > These are the very reasons I am against adding any kind of protocol
> > extension that would allow a client to directly touch the pipeline or
> > to bypass the compositor.
> 
> Well you need a client to do display calibration which necessarily
> means altering the video LUT (to linear) in order to do the
> measurements from which a correction curve is computed, and then that
> client needs to install that curve into the video LUT. Now, colord
> clearly has such capability, as it's applying vcgt tags in ICC
> profiles now. If colord can do it, then what prevents other clients
> from doing it?

The wayland server is capable of knowing the process on the other end
of the socket, and only exposing the color management control protocol
to specifically blessed clients.

- ajax


Re: HDR support in Wayland/Weston

2019-02-28 Thread Pekka Paalanen
On Wed, 27 Feb 2019 13:47:07 -0700
Chris Murphy  wrote:

> On Wed, Feb 27, 2019 at 5:27 AM Pekka Paalanen  wrote:
> >
> > there is a single, unambiguous answer on Wayland: the compositor owns
> > the pipeline. Therefore we won't have the kind of problems you describe
> > above.
> >
> > These are the very reasons I am against adding any kind of protocol
> > extension that would allow a client to directly touch the pipeline or
> > to bypass the compositor.  
> 
> Well you need a client to do display calibration which necessarily
> means altering the video LUT (to linear) in order to do the
> measurements from which a correction curve is computed, and then that
> client needs to install that curve into the video LUT. Now, colord
> clearly has such capability, as it's applying vcgt tags in ICC
> profiles now. If colord can do it, then what prevents other clients
> from doing it?

Hi Chris,

there is no need to expose hardware knobs like LUT etc. directly in
protocol even for measuring. We can have a special, privileged protocol
extension for measurement apps, where the measuring intent is explicit,
and the compositor can prepare the hardware correctly. This also avoids
updating the measurement apps to follow the latest hardware features
which the compositor might be using already. An old measurement app
could be getting a wrong result because it didn't know how to reset a new
part in the pipeline that the compositor is using.

Hence the compositor owns the pipeline at all times.

Permanently setting the new pipeline parameters is compositor
configuration - you wouldn't want to have to run the measurement app on
every boot to just install the right parameters. Compositor
configuration is again compositor specific. The privileged protocol
extension could have a way to deliver the new output color profile to
the compositor, for the compositor to save and apply it with any
methods it happens to use. You cannot assume that the compositor will
actually program "the hardware LUT" to achieve that, since there are
many other ways to achieve the same and hardware capabilities vary.
Nowadays there is often much more than just one LUT. Furthermore, in my
recent reply to Niels' color management extension proposal I derived a
case from the proposal where a compositor would be forced to use an
identity LUT and instead do all transformation during rendering.

Colord is not a client. Colord is currently called from a
weston plugin, the plugin has access to the compositor internal API to
set up a LUT. Colord cannot do it on its own.


Thanks,
pq



Re: HDR support in Wayland/Weston

2019-02-27 Thread Chris Murphy
On Wed, Feb 27, 2019 at 5:27 AM Pekka Paalanen  wrote:
>
> there is a single, unambiguous answer on Wayland: the compositor owns
> the pipeline. Therefore we won't have the kind of problems you describe
> above.
>
> These are the very reasons I am against adding any kind of protocol
> extension that would allow a client to directly touch the pipeline or
> to bypass the compositor.

Well you need a client to do display calibration which necessarily
means altering the video LUT (to linear) in order to do the
measurements from which a correction curve is computed, and then that
client needs to install that curve into the video LUT. Now, colord
clearly has such capability, as it's applying vcgt tags in ICC
profiles now. If colord can do it, then what prevents other clients
from doing it?
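The correction-curve step described above (measure through a linear LUT, then invert against a target curve) can be sketched as follows. This is a toy model with made-up gamma values and no real measurement I/O, assuming a monotonic measured response:

```python
def correction_lut(measured, target_gamma=2.2, size=256):
    """Compute a normalized video-card correction LUT from a measured
    display response, so that LUT + display approximates the target curve.
    `measured[i]` is the normalized luminance seen for stimulus i/(n-1)
    with an identity (linear) LUT loaded."""
    n = len(measured)

    def display_inverse(y):
        # Piecewise-linear inverse of the (monotonic) measured response.
        for i in range(n - 1):
            y0, y1 = measured[i], measured[i + 1]
            if y0 <= y <= y1:
                t = 0.0 if y1 == y0 else (y - y0) / (y1 - y0)
                return (i + t) / (n - 1)
        return 1.0

    return [display_inverse((i / (size - 1)) ** target_gamma)
            for i in range(size)]

# Toy example: a display whose native response happens to be gamma 2.4.
measured = [(i / 255) ** 2.4 for i in range(256)]
lut = correction_lut(measured)   # lut[i] rises monotonically from 0.0 to 1.0
```

The resulting LUT composed with the display's native response approximates the target gamma-2.2 curve; a real calibration tool additionally iterates, smooths, and handles measurement noise.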


> If we had to support such old games that insist on playing with the
> video LUT (assuming that there is just one LUT, no CTM, etc.), we could
> have a Wayland extension that allows attaching the LUT to a wl_surface,
> and then the compositor would apply the LUT any way it wants but only
> to that one window while at the same time all the other windows would
> remain looking good regardless.

Understood.

-- 
Chris Murphy

Re: HDR support in Wayland/Weston

2019-02-27 Thread Pekka Paalanen
On Fri, 22 Feb 2019 14:18:44 -0700
Chris Murphy  wrote:

> On Fri, Feb 22, 2019 at 9:00 AM Pekka Paalanen  wrote:
> >
> > Hi Chris,
> >
> > that is some interesting background, but I feel like I didn't quite
> > catch the point.
> >
> > If the CRTC color management pipeline (LUT-CTM-LUT + maybe more) is
> > programmed according to the monitor's color profile, where would those
> > "conflicting video card LUTs" arise from?  
> 
> A common offender were games. They'd try to access the video card LUT
> directly for effects, but then not reset it back to what it was,
> rather reset it to a hardwired assumption the game makes, e.g. all
> displays have gamma 2.2 so we'll just do that! A secondary offender
> were display calibration programs, user would upgrade or get a
> competing program and now you've got two startup applets that apply
> conflicting LUTs and it may be deterministically wrong or it may be
> subject to a race condition as to which applet applies.
> 
> The latter problem has mostly vanished with the advent of an OS API for
> reading the vcgt tag directly from the ICC profile set as the profile
> for a particular display.
> 
> So who owns the pipeline? If it's shared, then anyone can use it and
> not set it back the way they found it. Or alternatively if they're
> going to mess with that pipeline, to have a kind of "reset" API for
> the thing that ought to be mostly responsible for such a thing, e.g.
> colord.

Hi Chris,

there is a single, unambiguous answer on Wayland: the compositor owns
the pipeline. Therefore we won't have the kind of problems you describe
above.

These are the very reasons I am against adding any kind of protocol
extension that would allow a client to directly touch the pipeline or
to bypass the compositor.

If we had to support such old games that insist on playing with the
video LUT (assuming that there is just one LUT, no CTM, etc.), we could
have a Wayland extension that allows attaching the LUT to a wl_surface,
and then the compositor would apply the LUT any way it wants but only
to that one window while at the same time all the other windows would
remain looking good regardless.


Thanks,
pq



Re: HDR support in Wayland/Weston

2019-02-22 Thread Chris Murphy
On Fri, Feb 22, 2019 at 9:00 AM Pekka Paalanen  wrote:
>
> Hi Chris,
>
> that is some interesting background, but I feel like I didn't quite
> catch the point.
>
> If the CRTC color management pipeline (LUT-CTM-LUT + maybe more) is
> programmed according to the monitor's color profile, where would those
> "conflicting video card LUTs" arise from?

A common offender were games. They'd try to access the video card LUT
directly for effects, but then not reset it back to what it was,
rather reset it to a hardwired assumption the game makes, e.g. all
displays have gamma 2.2 so we'll just do that! A secondary offender
were display calibration programs, user would upgrade or get a
competing program and now you've got two startup applets that apply
conflicting LUTs and it may be deterministically wrong or it may be
subject to a race condition as to which applet applies.

The latter problem has mostly vanished with the advent of an OS API for
reading the vcgt tag directly from the ICC profile set as the profile
for a particular display.

So who owns the pipeline? If it's shared, then anyone can use it and
not set it back the way they found it. Or alternatively if they're
going to mess with that pipeline, to have a kind of "reset" API for
the thing that ought to be mostly responsible for such a thing, e.g.
colord.



-- 
Chris Murphy

Re: HDR support in Wayland/Weston

2019-02-22 Thread Pekka Paalanen
On Mon, 18 Feb 2019 10:44:15 -0700
Chris Murphy  wrote:

> On Fri, Feb 1, 2019 at 3:43 AM Pekka Paalanen  wrote:
> >
> > On Thu, 31 Jan 2019 12:03:25 -0700
> > Chris Murphy  wrote:
> >  
> > > I'm pretty sure most every desktop environment and distribution have
> > > settled on colord as the general purpose service.
> > > https://github.com/hughsie/colord
> > > https://www.freedesktop.org/software/colord/  
> >
> > FWIW, Weston already has a small plugin to use colord. The only thing
> > it does to apply anything is to set the simplest form of the gamma
> > ramps.  
> 
> Short version:
> Having just briefly looked that code, my best guess is colord is
> probably reading a vcgt tag in the ICC profile for the display, and
> applying it to the video card LUT (or one of them anyway).
> 
> Super extra long version:
> In ancient times (two decades+) there was a clear separation between
> display calibration (change the device) and characterization (record
> its behavior). Calibration was a combination of resetting and fiddling
> with display controls like brightness and contrast, and then also
> leveraging the at best 8 bit per channel LUT in the video card to
> achieve the desired white point and tone curve per channel.
> Characterization, which results in an ICC profile, happens on top of
> that. The profile is valid only when the calibration is applied, both
> the knob fiddling part and the applicable LUT in the video card. The
> LUT information used to be kept in a separate file, and then circa 15
> years ago Apple started to embed this information into the ICC profile
> as the vcgt tag, and the operating system display manager reads that
> tag and applies it to the video card LUT prior to login time. This has
> become fairly widespread, even though I'm not finding vcgt in the
> published ICC v4.3 spec. But they do offer this document:
> www.color.org/groups/medical/displays/controllingVCGT.pdf
> 
> There are some test profiles that contain various vcgt tags here:
> http://www.brucelindbloom.com/index.html?Vcgt.html
> 
> You really must have a reliable central service everyone agrees on to
> apply such a LUT, and then also banning anything else from setting a
> conflicting LUT. Again in ancient times we had all sorts of problems
> with applications messing around with the LUT, and instead of reading
> it first and restoring it the same way, they just reset it to some
> default, thereby making the ICC profile invalid.
> 
> The primary reason, again historically, for setting the white point
> outside of software (ideally set correctly in the display itself; less
> ideal is using a video card LUT) is because mismatching white points
> are really distracting, it prevents proper adaptation, and therefore
> everything looks wrong. Ironically the color managed content is
> decently likely to look more wrong than non-color-managed content. Why
> would there be mismatching white points? Correct white point fully
> color managed content in an application window, but not any other
> application or the surrounding UI of the desktop environment.
> 
> Ergo, some kind of "calibration" of white point independent of the
> color management system. Sometimes this is just a preset in the
> display's on-screen menu. Getting the display white point into the
> ballpark of the target white point means a less aggressive LUT in the video
> card, or even ideally a linear LUT.
> 
> Alternatively, you decide you're going to have some master of all
> pixels. That's the concept of full display compensation, where every
> pixel is subject to color management transforms regardless of its
> source application, all normalized to a single intermediate color
> space. In theory if you throw enough bits at this intermediate space,
> you could forgo the video card LUT based calibration.
> 
> The next workflow gotcha is multiple displays. In designing a color
> management system for an OS you have to decide if applications will
> have the option to display across multiple displays, each of which
> could have their own display profile.
> 
> I agree with Graeme that having different pipelines for calibration or
> characterization is asking for big trouble. The thing I worry about,
> is whether it's possible for each application to effectively have
> unique pipelines because they're all using different rendering
> libraries. The idea we'd have application specific characterization to
> account for each application pipeline just spells doom. The return of
> conflicting video card LUTs would be a nightmare.

Hi Chris,

that is some interesting background, but I feel like I didn't quite
catch the point.

If the CRTC color management pipeline (LUT-CTM-LUT + maybe more) is
programmed according to the monitor's color profile, where would those
"conflicting video card LUTs" arise from?
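For reference, the LUT-CTM-LUT pipeline mentioned here can be modeled per pixel as below. This is an illustrative sketch of the concept only, not actual KMS API usage:

```python
def lut_ctm_lut(rgb, degamma, ctm, gamma):
    """Apply a CRTC-style pipeline to one normalized RGB pixel:
    per-channel 1D degamma LUT -> 3x3 color transform matrix (CTM)
    -> per-channel 1D gamma LUT."""
    def lut1d(lut, v):
        # Piecewise-linear lookup into a normalized 1D LUT.
        pos = v * (len(lut) - 1)
        i = min(int(pos), len(lut) - 2)
        t = pos - i
        return lut[i] * (1 - t) + lut[i + 1] * t

    lin = [lut1d(degamma, c) for c in rgb]                      # linearize
    mixed = [sum(ctm[r][c] * lin[c] for c in range(3)) for r in range(3)]
    mixed = [min(max(v, 0.0), 1.0) for v in mixed]              # clamp
    return [lut1d(gamma, c) for c in mixed]                     # re-encode

# An identity pipeline leaves the pixel unchanged:
ident = [i / 255 for i in range(256)]
eye = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
out = lut_ctm_lut([0.25, 0.5, 0.75], ident, eye, ident)
```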


Thanks,
pq



Re: HDR support in Wayland/Weston

2019-02-18 Thread Chris Murphy
On Fri, Feb 1, 2019 at 3:43 AM Pekka Paalanen  wrote:
>
> On Thu, 31 Jan 2019 12:03:25 -0700
> Chris Murphy  wrote:
>
> > I'm pretty sure most every desktop environment and distribution have
> > settled on colord as the general purpose service.
> > https://github.com/hughsie/colord
> > https://www.freedesktop.org/software/colord/
>
> FWIW, Weston already has a small plugin to use colord. The only thing
> it does to apply anything is to set the simplest form of the gamma
> ramps.

Short version:
Having just briefly looked that code, my best guess is colord is
probably reading a vcgt tag in the ICC profile for the display, and
applying it to the video card LUT (or one of them anyway).

Super extra long version:
In ancient times (two decades+) there was a clear separation between
display calibration (change the device) and characterization (record
its behavior). Calibration was a combination of resetting and fiddling
with display controls like brightness and contrast, and then also
leveraging the at best 8 bit per channel LUT in the video card to
achieve the desired white point and tone curve per channel.
Characterization, which results in an ICC profile, happens on top of
that. The profile is valid only when the calibration is applied, both
the knob fiddling part and the applicable LUT in the video card. The
LUT information used to be kept in a separate file, and then circa 15
years ago Apple started to embed this information into the ICC profile
as the vcgt tag, and the operating system display manager reads that
tag and applies it to the video card LUT prior to login time. This has
become fairly widespread, even though I'm not finding vcgt in the
published ICC v4.3 spec. But they do offer this document:
www.color.org/groups/medical/displays/controllingVCGT.pdf

There are some test profiles that contain various vcgt tags here:
http://www.brucelindbloom.com/index.html?Vcgt.html
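A minimal sketch of the kind of per-channel ramp an OS loader would hand to the video card for the simplest, formula-style vcgt data (one gamma value per channel). This is illustrative only; real vcgt parsing, and the table-style variant that stores ramp entries directly, is more involved:

```python
def gamma_ramp(gamma, size=256, depth=16):
    """Build one channel's video-card ramp from a single gamma value,
    as a loader might for formula-style vcgt data (toy model)."""
    maxv = (1 << depth) - 1
    return [round(((i / (size - 1)) ** gamma) * maxv) for i in range(size)]

ramp = gamma_ramp(1.0)   # identity ramp: rises linearly from 0 to 65535
```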

You really must have a reliable central service everyone agrees on to
apply such a LUT, and then also banning anything else from setting a
conflicting LUT. Again in ancient times we had all sorts of problems
with applications messing around with the LUT, and instead of reading
it first and restoring it the same way, they just reset it to some
default, thereby making the ICC profile invalid.

The primary reason, again historically, for setting the white point
outside of software (ideally set correctly in the display itself; less
ideal is using a video card LUT) is because mismatching white points
are really distracting, it prevents proper adaptation, and therefore
everything looks wrong. Ironically the color managed content is
decently likely to look more wrong than non-color-managed content. Why
would there be mismatching white points? Correct white point fully
color managed content in an application window, but not any other
application or the surrounding UI of the desktop environment.

Ergo, some kind of "calibration" of white point independent of the
color management system. Sometimes this is just a preset in the
display's on-screen menu. Getting the display white point into the
ballpark of the target white point means a less aggressive LUT in the video
card, or even ideally a linear LUT.

Alternatively, you decide you're going to have some master of all
pixels. That's the concept of full display compensation, where every
pixel is subject to color management transforms regardless of its
source application, all normalized to a single intermediate color
space. In theory if you throw enough bits at this intermediate space,
you could forgo the video card LUT based calibration.
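The "enough bits" point can be illustrated numerically: round-tripping 8-bit gamma-encoded values through a linear-light intermediate of too few bits collapses distinct shadow levels. The following toy model assumes a pure 2.2 gamma encoding rather than any real transfer function:

```python
def surviving_levels(intermediate_bits, gamma=2.2):
    """Round-trip all 256 8-bit code values through a quantized linear-light
    intermediate and count how many distinct 8-bit outputs survive.
    A toy model of compositing in an intermediate space."""
    imax = (1 << intermediate_bits) - 1
    outs = set()
    for v in range(256):
        lin = (v / 255) ** gamma                   # decode to linear light
        q = round(lin * imax) / imax               # quantize the intermediate
        outs.add(round((q ** (1 / gamma)) * 255))  # re-encode to 8 bits
    return len(outs)

# A deeper intermediate preserves more of the original tonal levels:
# surviving_levels(8) loses many dark levels, surviving_levels(12) few.
```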

The next workflow gotcha is multiple displays. In designing a color
management system for an OS you have to decide if applications will
have the option to display across multiple displays, each of which
could have their own display profile.

I agree with Graeme that having different pipelines for calibration or
characterization is asking for big trouble. The thing I worry about,
is whether it's possible for each application to effectively have
unique pipelines because they're all using different rendering
libraries. The idea we'd have application specific characterization to
account for each application pipeline just spells doom. The return of
conflicting video card LUTs would be a nightmare.

--
Chris Murphy

Re: HDR support in Wayland/Weston

2019-02-07 Thread Kai-Uwe
Hello Chris,

Am 31.01.19 um 20:03 schrieb Chris Murphy:
> On Wed, Jan 30, 2019 at 10:54 PM Nautiyal, Ankit K
>  wrote:
>>  From where can the client-applications get the ICC profile files? Does
>> the client application manufacture it for a given color space and a
>> standard template?
> I'm pretty sure most every desktop environment and distribution have
> settled on colord as the general purpose service.


Chris is quite involved in _colord_ consultancy as a color management
__consultant__ and writes books about color management. But he does not
disclose this commercial interest clash anywhere. Are face-down cards
common advertising practice in the US?

regards

Kai-Uwe Behrmann
-- 
www.oyranos.org - Color Management System project lead
www.behrmann.name - coding for various open source and commercial color
projects


PS: To make it clear, I am fine with different people speaking up with
their technical opinions, including about their own business products, as
long as it is publicly visible in an email and has technical benefit.

PPS: sorry for the late noise, and of course it is great to see the topic
becoming lively again




Re: HDR support in Wayland/Weston

2019-02-05 Thread Graeme Gill
Chris Murphy wrote:

Hi Chris,

> I'm pretty sure most every desktop environment and distribution have
> settled on colord as the general purpose service.
> https://github.com/hughsie/colord
> https://www.freedesktop.org/software/colord/

right, but colord is not needed to run X11 color management.
Using ArgyllCMS to load profiles on startup is a viable alternative.
There is also Oyranos.

The experience of trying to get better integration between
colord and color management tools like ArgyllCMS is one of
the things that informs my views on a Wayland color management
protocol. (i.e. it doesn't currently work, even though it
used to at one point.) Having a neutral and consistently
implemented API for color managed clients and color management
tools would be a very good thing.

> By default, colord will create a display profile on-the-fly, for
> attached displays, based on display provided EDID information.

Certainly a useful approach a system may take to providing default
display profiles.

> You'd want to evaluate the interfaces of Argyll CMS and lcms2; it's
> possible you'd use Argyll CMS for profile creation, and lcms2 as the
> transformation engine, for example.

ArgyllCMS's CMM probably isn't a choice for some compositor implementations,
due to its GPL licensing and current lack of support for ICC v4. I don't think
it has any noticeable speed advantage over lcms2 for 3-channel conversions
either, and lcms2 is much better integrated as a drop-in CMM. lcms2's plug-in
architecture may be an advantage in implementing the HDR tweaks needed,
as well as for those wishing to optimize compositor color management performance.

Cheers,
Graeme.


Re: HDR support in Wayland/Weston

2019-02-05 Thread Graeme Gill
Pekka Paalanen wrote:
> Graeme Gill  wrote:

Hi,

>> I don't have any basis to voice an opinion as to which particular protocol
>> these different aspects would best fall into (wayland core or one of the
>> supplementary xdg protocols ? - I'm not clear on what the purpose of these
>> divisions is), I just know that they are intimately related, and development
>> of one without the other will risk mis-steps. I don't really understand
>> why you think they are orthogonal.

> color management related extensions would be neither Wayland core (in
> wayland.xml file) nor under the XDG umbrella (XDG is specific to
> desktops while color is not specific to desktops).

I understand that color management would be a Wayland extension protocol.
I'm not clear on whether color management tool support is specific to
desktops or not though.

> The Wayland upstream cannot force anyone to implement any
> extension. The upstream can offer design help, reviews, discussion
> forum and a repository to host extensions, which should make extensions
> popular among client software and suitable for more than one
> compositor. The pressure for someone to implement an extension only
> ever comes from the community: people wanting new things to work.

Sure.

> Color management related extensions would be several different
> extensions, each catering to its own scope. The one we are talking
> about here is for clients to be able to reasonably provide color
> managed content and enable the correct displaying of it under automatic
> adaptation to outputs. Other color management related extensions could
> allow e.g. measuring the color profiles of monitors, to be discussed at
> another time.

I disagree. They are two sides of the same coin. They may well
be described by separate (but closely related) protocols, but
they should be designed and implemented together. This minimizes
the total amount of work involved, and ensures cohesion.

> Being able to provide color managed content implies knowing the color
> properties of the used hardware and images. It does not imply being
> able to measure the color properties:

Color properties of the display have to come from somewhere. It is less work
overall, and a more secure result, if they are measured in the way they
are intended to be.

> you don't re-measure the color
> properties continuously while using an application. Or if you actually
> do, we can cater for that use case when designing a measurement
> extension.

Users may choose to perform display calibration, profiling or verification
at any time. It may be once a year, once a month or once a day. They
may need to do it right now, just before they start a critical project,
or perhaps because they have altered some display setting (brightness,
contrast, color temperature, colorspace emulation mode.)

> I don't understand. How is a color profile, or say, an .icc file or
> whatever you use to store color profiles, dependent on the window
> system? Do you mean you cannot use the same profile data files on
> Windows and Linux X11?

It's unsafe to assume so. They may be identical, or they may not.
The conservative assumption amongst color critical users is that they
are not. As a color technologist I suspect they are often very
similar. Anecdotally, people have reported differences.
(In the days of CRTs, they almost certainly differed with
different screen resolutions. Probably less so with modern
display technologies.)

In any case, expecting a user to boot up an alternate
operating system to do a color profile is a pretty big ask,
not something that is currently expected, and few will
be open to such an expectation. They will instead switch to
a system that doesn't require this of them.

> Or do you mean that the window system implementation adds its own
> unknown effect to the pixel values?

It's possible. The conservative assumption is to assume this might be
the case. ("conservative" in the sense of not wanting the uncertainty,
or having to waste time verifying that they are in fact identical.)

> If that is so, then the measuring
> environment on those systems is fundamentally broken to begin with.

Not at all. Any such processing is part and parcel of the effective
display characteristic. It has to be profiled as well, to end up
with a profile that is valid for that system.

>> It's unsafe to switch pixel pipelines in the process of characterization,
>> since it risks measuring transformations that happen/don't happen in
>> one context, and not in the other. It's also not something that
>> end users will put up with - if a system claims to support color management,
>> then it includes the color management tools necessary to setup,
>> maintain and verify its operation on a day to day basis.
> 
> That seems like an unrealistic assumption, it kind of precludes the
> whole idea of interoperability. If your assumption is held, then you
> would need to make separate measurements for e.g. every single pixel
> format × color space an application 

Re: HDR support in Wayland/Weston

2019-02-01 Thread Pekka Paalanen
On Thu, 31 Jan 2019 12:03:25 -0700
Chris Murphy  wrote:

> Hi Ankit,
> 
> 
> On Wed, Jan 30, 2019 at 10:54 PM Nautiyal, Ankit K
>  wrote:
> >
> > Hi Ole,
> >
> > I was going through the protocol you had proposed, and have some silly
> > questions, please pardon my ignorance.
> >
> >  From where can the client-applications get the ICC profile files? Does
> > the client application manufacture it for a given color space and a
> > standard template?  
> 
> I'm pretty sure most every desktop environment and distribution have
> settled on colord as the general purpose service.
> https://github.com/hughsie/colord
> https://www.freedesktop.org/software/colord/

FWIW, Weston already has a small plugin to use colord. The only thing
it does to apply anything is to set the simplest form of the gamma
ramps.
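For readers unfamiliar with that mechanism: a gamma ramp in its simplest form is just a per-channel lookup table of 16-bit entries that the compositor hands to the kernel. Here is a minimal sketch (a made-up helper, not Weston's actual code) assuming a plain power-law curve rather than a measured vcgt curve:

```python
def gamma_ramp(size=256, gamma=2.2):
    """Build one channel of a simple gamma ramp: `size` 16-bit entries
    mapping normalized input through a 1/gamma power curve. Real display
    profiles carry measured (vcgt) curves, not a single exponent; the
    power law here is only for illustration."""
    return [round(((i / (size - 1)) ** (1.0 / gamma)) * 0xFFFF)
            for i in range(size)]
```

A compositor would typically build three such arrays (R, G, B) and program them through the KMS legacy gamma LUT.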


Thanks,
pq




Re: HDR support in Wayland/Weston

2019-01-31 Thread Chris Murphy
Hi Ankit,


On Wed, Jan 30, 2019 at 10:54 PM Nautiyal, Ankit K
 wrote:
>
> Hi Ole,
>
> I was going through the protocol you had proposed, and have some silly
> questions, please pardon my ignorance.
>
>  From where can the client-applications get the ICC profile files? Does
> the client application manufacture it for a given color space and a
> standard template?

I'm pretty sure most every desktop environment and distribution have
settled on colord as the general purpose service.
https://github.com/hughsie/colord
https://www.freedesktop.org/software/colord/

By default, colord will create a display profile on-the-fly, for
attached displays, based on display provided EDID information. There
can be buggy EDID, and in practice there are no display updates to fix
it after the product ships. Whether this is sanity checked prior to
building a profile, or later, depends on the policy of the platform at
the time the buggy EDID is encountered. Where it's sanity checked also
varies: it could be the platform specific profile manager e.g. colord;
the color management module (CMM) which is the engine that uses
profiles to perform color space transforms; or it could happen at the
application level.

Profiles are sometimes packaged with applications, sometimes stand
alone. Argyll CMS includes a bunch of standardized RGB profiles,
Ghostscript includes another set, and colord includes some as well.

You're definitely best off deferring to tools that specialize in
creating ICC profiles rather than building them yourself.


> Or the compositor needs to store .icc files for each of the
> color-spaces, which the clients can use.
>
> Also, are there already libraries that can be used to parse the .icc files?
>
> I can see some recommended by ICC, like SampleICC, Argyll etc, is there
> something which suits our case better?

You'd want to evaluate the interfaces of Argyll CMS and lcms2; it's
possible you'd use Argyll CMS for profile creation, and lcms2 as the
transformation engine, for example.

http://www.littlecms.com/

-- 
Chris Murphy


Re: HDR support in Wayland/Weston

2019-01-31 Thread Nautiyal, Ankit K

Thanks for the clarification Ole.

Regards,

Ankit


On 1/31/2019 2:22 PM, Niels Ole Salscheider wrote:

Hi Ankit,

please find my answers below.

On Thursday, 31 January 2019, 06:54:03 CET, Nautiyal, Ankit K wrote:

Hi Ole,

I was going through the protocol you had proposed, and have some silly
questions, please pardon my ignorance.

  From where can the client-applications get the ICC profile files? Does
the client application manufacture it for a given color space and a
standard template?

Or the compositor needs to store .icc files for each of the
color-spaces, which the clients can use.

The compositor will know about a few widely used ICC profiles. In my proposal
this is sRGB, but we could also add e.g. an HDR profile. If the client wants to
use one of these profiles it can just tell the compositor and use it. If the
client wants to provide the surface in any other color space, it has to create
a matching color profile from an ICC profile. For that, the client passes a
file descriptor for the profile to the compositor. The file descriptor might
correspond to a file on the hard disk (where it was installed by the client/
toolkit/some third party) or to some temporary file created by the client. In
the latter case the data might, for example, come from the embedded profile of
an image, or it might have been composed by the client.
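As a rough illustration of what a compositor might do with such a file descriptor, the sketch below reads the fixed 128-byte ICC header and checks the mandatory 'acsp' signature before accepting the profile. Field offsets follow the ICC.1 specification; the function name and validation policy are made up for this example:

```python
import os
import struct

def read_icc_header(fd):
    """Sketch of compositor-side validation for a client-provided ICC
    profile fd: read the fixed 128-byte header and sanity-check it.
    Offsets follow the ICC.1 spec; this helper is illustrative only."""
    os.lseek(fd, 0, os.SEEK_SET)
    header = os.read(fd, 128)
    if len(header) < 128:
        raise ValueError("file too short to be an ICC profile")
    size, = struct.unpack_from(">I", header, 0)   # profile size (big-endian)
    device_class = header[12:16].decode("ascii")  # e.g. 'mntr' for a display
    color_space = header[16:20].decode("ascii")   # e.g. 'RGB '
    if header[36:40] != b"acsp":                  # mandatory file signature
        raise ValueError("missing 'acsp' signature; not an ICC profile")
    return size, device_class, color_space
```

A real compositor would then hand the profile data to a CMM such as lcms2; the check above only guards against obviously bogus fds.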


Also, are there already libraries that can be used to parse the .icc files?

I can see some recommended by ICC, like SampleICC, Argyll etc, is there
something which suits our case better?

Yes, there are open source libraries to handle .icc files. You already
mentioned ArgyllCMS. There is also LittleCMS, which is easy to use and enough
for a lot of use cases.

Best regards,

Ole


Regards,

Ankit

On 1/10/2019 9:31 PM, Niels Ole Salscheider wrote:

Hello,

on a first glance this sounds sensible. Would it work well with the last
color management protocol proposal that I made or do you see issues
there? We could add REC2020 as another predefined profile.

https://lists.freedesktop.org/archives/wayland-devel/2017-January/032769.html

I think the last proposal was mostly sane and usable for everybody, but
there was not much interest afterwards. However, there was a lot of
discussion with wishes from different sides that went into this. The
relevant mailing list threads are the following, but you have to follow
the discussion over the next months:

https://lists.freedesktop.org/archives/wayland-devel/2016-November/031728.html
https://lists.freedesktop.org/archives/wayland-devel/2014-March/013951.html

Best regards,
Ole

On Thursday, 10 January 2019, 16:02:18 CET, Sharma, Shashank wrote:

Hello All,

This mail is to propose a design for enabling HDR support in
Wayland/Weston stack, using display engine capabilities, and get more
feedback and input from community.
Here are few points (you might already know these), about HDR
framebuffers, videos and displays:
- HDR content/buffers are composed in the REC2020 colorspace, with bit depths
of 10/12/16 BPC. Some of the popular formats are P010, P012, P016.
- HDR content comes with its own metadata, to be applied to get the
right luminance at the display device.

   - The metadata can be of two types: 1. static, 2. dynamic. For
simplicity, this solution is focusing on static HDR only (the HDR10 standard)
- HDR content also provides its supported EOTF (electro-optical transfer
function) information, which is a curve (like the sRGB gamma curve). One
popular EOTF is PQ (ST2084).
- HDR capable displays advertise their EOTF and HDR metadata support
information in EDID CEA-861-G blocks.
- Normal sRGB buffers are composed in the sRGB color space following REC709
specifications.

- For accurate blending in display engines, we need to make sure of the
following:

   - All the buffers are in the same colorspace (Rec 709 or Rec 2020)
   - All the buffers are linear (gamma/EOTF removed)
   - All the buffers are tone mapped in the same zone (HDR or SDR)
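To make the "linear" requirement above concrete: the PQ (SMPTE ST 2084) EOTF maps a non-linear code value to absolute luminance, and a compositor has to undo it before blending. Below is a sketch of both directions using the constants published in the ST 2084 spec; it is illustrative only, not the proposed Weston code:

```python
# SMPTE ST 2084 (PQ) constants, as defined in the spec
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32     # 18.8515625
C3 = 2392 / 4096 * 32     # 18.6875

def pq_eotf(signal):
    """PQ EOTF: non-linear code value in [0, 1] -> luminance in cd/m^2."""
    p = signal ** (1.0 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1.0 / M1)

def pq_inv_eotf(luminance):
    """Inverse EOTF: luminance in cd/m^2 -> non-linear code value in [0, 1]."""
    y = (luminance / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2
```

The hardware "degamma" step discussed later in the thread approximates `pq_eotf` with a LUT, and the CRTC gamma step approximates `pq_inv_eotf`.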

Please refer to the block diagram below, which presents a simple case of
a HDR P010 movie playback, with HDR buffers as video buffers, and SDR
buffers as subtitles. The subsystem looks and works like this:
- A client decodes the buffer (using FFmpeg for example) and gets the
two buffers, one with video (HDR) and one with subtitles (SDR)

- Client passes the following information to the compositor:
- The actual buffers
- Their colorspace information: BT2020 for the HDR buffer, REC709 for the
SDR buffer (planning to add a new protocol extension for this)

- The HDR metadata of the content (planning to add a new protocol
for this)

- Compositor's actions:
  - Reads the end display's HDR capabilities from the display EDID. Assume
it's an HDR HDMI monitor.
  - Compositor tone maps every view's framebuffer to match the tone of the end
display, applying a libVA filter. In this example:
   - The SDR subtitles frame will go through SDR to HDR tone
mapping (called S2H)
   - The HDR video frame will go through

Re: HDR support in Wayland/Weston

2019-01-31 Thread Niels Ole Salscheider
Hi Ankit,

please find my answers below.

On Thursday, 31 January 2019, 06:54:03 CET, Nautiyal, Ankit K wrote:
> Hi Ole,
> 
> I was going through the protocol you had proposed, and have some silly
> questions, please pardon my ignorance.
> 
>  From where can the client-applications get the ICC profile files? Does
> the client application manufacture it for a given color space and a
> standard template?
> 
> Or the compositor needs to store .icc files for each of the
> color-spaces, which the clients can use.

The compositor will know about a few widely used ICC profiles. In my proposal
this is sRGB, but we could also add e.g. an HDR profile. If the client wants to
use one of these profiles it can just tell the compositor and use it. If the
client wants to provide the surface in any other color space, it has to create
a matching color profile from an ICC profile. For that, the client passes a
file descriptor for the profile to the compositor. The file descriptor might
correspond to a file on the hard disk (where it was installed by the client/
toolkit/some third party) or to some temporary file created by the client. In
the latter case the data might, for example, come from the embedded profile of
an image, or it might have been composed by the client.

> Also, are there already libraries that can be used to parse the .icc files?
> 
> I can see some recommended by ICC, like SampleICC, Argyll etc, is there
> something which suits our case better?

Yes, there are open source libraries to handle .icc files. You already
mentioned ArgyllCMS. There is also LittleCMS, which is easy to use and enough
for a lot of use cases.

Best regards,

Ole

> Regards,
> 
> Ankit
> 
> On 1/10/2019 9:31 PM, Niels Ole Salscheider wrote:
> > Hello,
> > 
> > on a first glance this sounds sensible. Would it work well with the last
> > color management protocol proposal that I made or do you see issues
> > there? We could add REC2020 as another predefined profile.
> > 
> > https://lists.freedesktop.org/archives/wayland-devel/2017-January/032769.html
> > 
> > I think the last proposal was mostly sane and usable for everybody, but
> > there was not much interest afterwards. However, there was a lot of
> > discussion with wishes from different sides that went into this. The
> > relevant mailing list threads are the following, but you have to follow
> > the discussion over the next months:
> > 
> > https://lists.freedesktop.org/archives/wayland-devel/2016-November/031728.html
> > https://lists.freedesktop.org/archives/wayland-devel/2014-March/013951.html
> > 
> > Best regards,
> > Ole
> > 
> > On Thursday, 10 January 2019, 16:02:18 CET, Sharma, Shashank wrote:
> >> Hello All,
> >> 
> >> This mail is to propose a design for enabling HDR support in
> >> Wayland/Weston stack, using display engine capabilities, and get more
> >> feedback and input from community.
> >> Here are few points (you might already know these), about HDR
> >> framebuffers, videos and displays:
> >> - HDR content/buffers are composed in the REC2020 colorspace, with bit
> >> depths of 10/12/16 BPC. Some of the popular formats are P010, P012, P016.
> >> - HDR content comes with its own metadata, to be applied to get the
> >> right luminance at the display device.
> >> 
> >>   - The metadata can be of two types: 1. static, 2. dynamic. For
> >> simplicity, this solution is focusing on static HDR only (the HDR10 standard)
> >> - HDR content also provides its supported EOTF (electro-optical transfer
> >> function) information, which is a curve (like the sRGB gamma curve). One
> >> popular EOTF is PQ (ST2084).
> >> - HDR capable displays advertise their EOTF and HDR metadata support
> >> information in EDID CEA-861-G blocks.
> >> - Normal sRGB buffers are composed in the sRGB color space following REC709
> >> specifications.
> >> 
> >> - For accurate blending in display engines, we need to make sure of the following:
> >>   - All the buffers are in the same colorspace (Rec 709 or Rec 2020)
> >>   - All the buffers are linear (gamma/EOTF removed)
> >>   - All the buffers are tone mapped in the same zone (HDR or SDR)
> >> 
> >> Please refer to the block diagram below, which presents a simple case of
> >> a HDR P010 movie playback, with HDR buffers as video buffers, and SDR
> >> buffers as subtitles. The subsystem looks and works like this:
> >> - A client decodes the buffer (using FFMpeg for example) and gets the
> >> two buffers, one with video (HDR) and one subtitles (SDR)

Re: HDR support in Wayland/Weston

2019-01-30 Thread Nautiyal, Ankit K

Hi Ole,

I was going through the protocol you had proposed, and have some silly 
questions, please pardon my ignorance.


From where can the client-applications get the ICC profile files? Does 
the client application manufacture it for a given color space and a 
standard template?


Or the compositor needs to store .icc files for each of the 
color-spaces, which the clients can use.


Also, are there already libraries that can be used to parse the .icc files?

I can see some recommended by ICC, like SampleICC, Argyll etc, is there 
something which suits our case better?


Regards,

Ankit



On 1/10/2019 9:31 PM, Niels Ole Salscheider wrote:

Hello,

on a first glance this sounds sensible. Would it work well with the last color
management protocol proposal that I made or do you see issues there?
We could add REC2020 as another predefined profile.

https://lists.freedesktop.org/archives/wayland-devel/2017-January/032769.html

I think the last proposal was mostly sane and usable for everybody, but there
was not much interest afterwards. However, there was a lot of discussion with
wishes from different sides that went into this. The relevant mailing list
threads are the following, but you have to follow the discussion over the next
months:

https://lists.freedesktop.org/archives/wayland-devel/2016-November/031728.html
https://lists.freedesktop.org/archives/wayland-devel/2014-March/013951.html

Best regards,
Ole

On Thursday, 10 January 2019, 16:02:18 CET, Sharma, Shashank wrote:

Hello All,

This mail is to propose a design for enabling HDR support in
Wayland/Weston stack, using display engine capabilities, and get more
feedback and input from community.
Here are few points (you might already know these), about HDR
framebuffers, videos and displays:
- HDR content/buffers are composed in the REC2020 colorspace, with bit depths
of 10/12/16 BPC. Some of the popular formats are P010, P012, P016.
- HDR content comes with its own metadata, to be applied to get the
right luminance at the display device.
  - The metadata can be of two types: 1. static, 2. dynamic. For
simplicity, this solution is focusing on static HDR only (the HDR10 standard)
- HDR content also provides its supported EOTF (electro-optical transfer
function) information, which is a curve (like the sRGB gamma curve). One
popular EOTF is PQ (ST2084).
- HDR capable displays advertise their EOTF and HDR metadata support
information in EDID CEA-861-G blocks.
- Normal sRGB buffers are composed in the sRGB color space following REC709
specifications.
- For accurate blending in display engines, we need to make sure of the following:
  - All the buffers are in the same colorspace (Rec 709 or Rec 2020)
  - All the buffers are linear (gamma/EOTF removed)
  - All the buffers are tone mapped in the same zone (HDR or SDR)
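On the EDID side, the HDR capabilities referred to above live in a CTA/CEA-861-G "HDR Static Metadata" data block (extended tag code 0x06), whose second payload byte is a bitmask of supported EOTFs. A hedged sketch of pulling those bits out of one such block; the block bytes in the usage example are fabricated for illustration:

```python
# EOTF support bits in the byte following the extended tag of a CEA-861-G
# HDR Static Metadata Data Block (extended tag 0x06). Bit assignments
# per CTA-861-G; this parser is an illustrative sketch only.
EOTF_SDR, EOTF_TRADITIONAL_HDR, EOTF_ST2084, EOTF_HLG = 0, 1, 2, 3

def parse_hdr_static_block(payload):
    """Given a data block payload starting at the extended tag byte,
    return the names of the EOTFs the display claims to support."""
    if payload[0] != 0x06:
        raise ValueError("not an HDR static metadata block")
    eotf_bits = payload[1]
    names = {EOTF_SDR: "SDR", EOTF_TRADITIONAL_HDR: "traditional HDR",
             EOTF_ST2084: "ST2084 (PQ)", EOTF_HLG: "HLG"}
    return [name for bit, name in names.items() if eotf_bits & (1 << bit)]
```

A compositor would run something like this over the data block collection of the EDID's CTA extension before deciding whether to advertise HDR to clients.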

Please refer to the block diagram below, which presents a simple case of
a HDR P010 movie playback, with HDR buffers as video buffers, and SDR
buffers as subtitles. The subsystem looks and works like this:
- A client decodes the buffer (using FFmpeg for example) and gets the
two buffers, one with video (HDR) and one with subtitles (SDR)
- Client passes the following information to the compositor:
   - The actual buffers
   - Their colorspace information: BT2020 for the HDR buffer, REC709 for the
SDR buffer (planning to add a new protocol extension for this)
   - The HDR metadata of the content (planning to add a new protocol
for this)

- Compositor's actions:
 - Reads the end display's HDR capabilities from the display EDID. Assume
it's an HDR HDMI monitor.
 - Compositor tone maps every view's framebuffer to match the tone of the end
display, applying a libVA filter. In this example:
  - The SDR subtitles frame will go through SDR to HDR tone
mapping (called S2H)
  - The HDR video frame will go through HDR to HDR tone mapping
(called H2H) if the HDR capabilities of the monitor and content are different.
  - Now both the buffers and the monitor are in the same tone
mapped range.
  - As the end display is HDR capable, and one of the content frames
is HDR, the compositor will prepare all other planes for color space
conversion (CSC) from REC709->REC2020 using the plane CSC property.
  - As the CSC and blending should be done in linear space, the compositor
will also use plane-level degamma to make the buffers linear.
  - These actions will make sure that, during blending:
  - All the buffers are in the same colorspace (REC2020)
  - All the buffers are linear
  - All the buffers are tone mapped (HDR)
  - The plane-level color properties patch for DRM can be found
here: https://patchwork.freedesktop.org/series/30875/
  - Now, in order to re-apply the HDR curve, the compositor will apply
CRTC-level gamma, so that the output buffer is non-linear again.
  - To pass the output HDR information to the kernel, so that it can
create and send AVI info-frames to HDMI, the compositor will set the Connector
HDR metadata property.
  - C
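For reference, the REC709→REC2020 CSC step in the pipeline above is, in linear light, a single 3x3 matrix multiply. The coefficients below are the ones published in ITU-R BT.2087; the helper itself is an illustrative sketch, not the proposed plane CSC implementation:

```python
# Linear-light RGB BT.709 -> BT.2020 conversion matrix (ITU-R BT.2087).
# Valid only after the EOTF/gamma has been removed, matching the
# requirement that CSC and blending happen on linear pixels.
M_709_TO_2020 = (
    (0.6274, 0.3293, 0.0433),
    (0.0691, 0.9195, 0.0114),
    (0.0164, 0.0880, 0.8956),
)

def csc_709_to_2020(rgb):
    """Convert one linear BT.709 RGB triple to BT.2020 primaries."""
    r, g, b = rgb
    return tuple(m[0] * r + m[1] * g + m[2] * b for m in M_709_TO_2020)
```

In hardware this matrix would be programmed into the plane CSC property; white maps to white (each row sums to 1), and pure BT.709 primaries land inside the wider BT.2020 gamut.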

Re: HDR support in Wayland/Weston

2019-01-29 Thread Pekka Paalanen
On Tue, 29 Jan 2019 16:02:16 +1100
Graeme Gill  wrote:

> Pekka Paalanen wrote:
> 

> >>> Yes, a compositor must implement all that, but this is now slipping to
> >>> the topic of calibration, which is very much off-scope for today. We
> >>> only want to consider how applications produce and provide content with
> >>> specific color properties for now.  
> 
> >> It's a necessary part of the picture. There's not much point in
> >> moving ahead with Color Management support if there is no
> >> easy means of creating display profiles to populate it with. So in
> >> terms of practical implementation I see them going hand in hand.  
> > 
> > They may go hand-in-hand in practise, but protocol-wise I still see
> > them as two completely orthogonal features, and we need to split the
> > work into manageable chunks anyway.  
> 
> I don't have any basis to voice an opinion as to which particular protocol
> these different aspects would best fall into (wayland core or one of the
> supplementary xdg protocols ? - I'm not clear on what the purpose of these
> divisions is), I just know that they are intimately related, and development
> of one without the other will risk mis-steps. I don't really understand
> why you think they are orthogonal.

Hi Graeme,

color management related extensions would be neither Wayland core (in
wayland.xml file) nor under the XDG umbrella (XDG is specific to
desktops while color is not specific to desktops).

Color management related extensions, just like presentation-time and
viewporter, would live in wayland-protocols repository where they are
easily shared by all compositor and client software that wants to use
them. The Wayland upstream cannot force anyone to implement any
extension. The upstream can offer design help, reviews, discussion
forum and a repository to host extensions, which should make extensions
popular among client software and suitable for more than one
compositor. The pressure for someone to implement an extension only
ever comes from the community: people wanting new things to work.

Color management related extensions would be several different
extensions, each catering to its own scope. The one we are talking
about here is for clients to be able to reasonably provide color
managed content and enable the correct displaying of it under automatic
adaptation to outputs. Other color management related extensions could
allow e.g. measuring the color profiles of monitors, to be discussed at
another time.

Being able to provide color managed content implies knowing the color
properties of the used hardware and images. It does not imply being
able to measure the color properties: you don't re-measure the color
properties continuously while using an application. Or if you actually
do, we can cater for that use case when designing a measurement
extension.

> > Surely the color characterisation of a monitor is not specific to a
> > window system?  
> 
> Very much so, since in practice this process relies on running an application
> that makes use of the window system for its operation.

I don't understand. How is a color profile, or say, an .icc file or
whatever you use to store color profiles, dependent on the window
system? Do you mean you cannot use the same profile data files on
Windows and Linux X11?

Or do you mean that the window system implementation adds its own
unknown effect to the pixel values? If that is so, then the measuring
environment on those systems is fundamentally broken to begin with.

> > While we still don't have a good solution to
> > measurements in Wayland, people can measure their monitors on Xorg, or
> > even Windows, I hope.  
> 
> It's unsafe to switch pixel pipelines in the process of characterization,
> since it risks measuring transformations that happen/don't happen in
> one context, and not in the other. It's also not something that
> end users will put up with - if a system claims to support color management,
> then it includes the color management tools necessary to setup,
> maintain and verify its operation on a day to day basis.

That seems like an unrealistic assumption, it kind of precludes the
whole idea of interoperability. If your assumption is held, then you
would need to make separate measurements for e.g. every single pixel
format × color space an application is going to provide content to the
compositor in, multiplied by all the different ways a compositor can
make use of the graphics card hardware to produce the video signal on
that single system, and you get to redo this every time you update any
software component.

I would rather aim towards a design where you measure your monitor (and
not the window system), and the compositor will take care of keeping
that measured profile valid regardless of how the compositor chooses to
use the hardware. I would put responsibility to the compositor and
hardware drivers to function correctly, instead of requiring every
application needing to verify the whole system every time it starts 

Re: HDR support in Wayland/Weston

2019-01-28 Thread Graeme Gill
Pekka Paalanen wrote:



> yes, this is a good concern. I think we might be needing an extension
> to tell the client which output to prioritise for. We already have an
> analogue of that in the Presentation-time extension: a wl_surface's
> timings can only be synchronised to one output at a time, so the
> presentation feedback events tell the client which output it was
> synchronised to on each frame.

Hello Pekka,

yes, that sounds like a very similar concern, and one that perhaps
could be leveraged or imitated.

(I was being overly pessimistic in expecting only one output
 to be accurately color rendered - in fact a color aware application
 should be able to do color accurate rendering to all surfaces
 that are mapped to a single output if that output has an associated
 color profile. It would just be surfaces mapped to more than
 one output that would be compromised on all but one output.)

> I cannot imagine a case where the output for timings and the output for
> color would be different, but I also probably would not re-use the
> Presentation-time extension here. Instead, we would probably need to
> handle it in the color related extensions explicitly.

OK.

> Your explanation further below is very enlightening on what "rendering"
> is.

Thanks.

> The explanation of "intents" is an eye-opener to me. I guess the only
> thing I had been imagining was the Absolute Colorimetric method - that
> colors could be absolute.

Absolute Colorimetric is less commonly used than Relative Colorimetric
style intents, since people normally expect the image white to map to
device white, no matter what that may be for a particular device.
Absolute Colorimetric typically has more specialized uses,
such as side-by-side proofing, spot colors, etc.

>>> Yes, a compositor must implement all that, but this is now slipping to
>>> the topic of calibration, which is very much off-scope for today. We
>>> only want to consider how applications produce and provide content with
>>> specific color properties for now.

>> It's a necessary part of the picture. There's not much point in
>> moving ahead with Color Management support if there is no
>> easy means of creating display profiles to populate it with. So in
>> terms of practical implementation I see them going hand in hand.
> 
> They may go hand-in-hand in practise, but protocol-wise I still see
> them as two completely orthogonal features, and we need to split the
> work into manageable chunks anyway.

I don't have any basis to voice an opinion as to which particular protocol
these different aspects would best fall into (wayland core or one of the
supplementary xdg protocols ? - I'm not clear on what the purpose of these
divisions is), I just know that they are intimately related, and development
of one without the other will risk mis-steps. I don't really understand
why you think they are orthogonal.

> Surely the color characterisation of a monitor is not specific to a
> window system?

Very much so, since in practice this process relies on running an application
that makes use of the window system for its operation.

> While we still don't have a good solution to
> measurements in Wayland, people can measure their monitors on Xorg, or
> even Windows, I hope.

It's unsafe to switch pixel pipelines in the process of characterization,
since it risks measuring transformations that happen/don't happen in
one context, and not in the other. It's also not something that
end users will put up with - if a system claims to support color management,
then it includes the color management tools necessary to setup,
maintain and verify its operation on a day to day basis.

From a development point of view it's working in an unnecessarily
difficult manner, since there will be nothing to exercise the full
range of color management capabilities, or to measurably verify
the implementation against.

> Or even better: using a measuring tool running
> directly on DRM KMS, which ensures the tool gets absolute control over
> the display hardware.

Definitely the last thing you want, since that leaves a gaping
hole for the hardware to be setup differently between profiling
and use of the profile, thereby making the profile invalid.
The safe approach is to maintain as much similarity as possible
between system setup while profiling and when the profile
is used. Ensuring that a profile remains valid from
profiling to use is a key consideration in color management,
and without this, it all falls to pieces.

Completely made up example: Say the display is actually
a TV running over HDMI, and (for whatever reason) the
compositor sets up the link in YCbCr mode rather than RGB,
and as luck would have it, the video card encodes the framebuffer
RGB using REC 709 1150/60/2:1 standard, while the TV manufacturer
didn't bother supporting the various YCbCr encodings and hard
coded it to decode Rec601 YCbCr. So there is a hidden color
change going on when Wayland is running as well as some
minor quantization 
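The hidden Rec.709/Rec.601 mismatch in the made-up example above is easy to quantify. A minimal sketch (purely illustrative, not Weston code; only the luma coefficients come from the two specs):

```python
# Illustrative arithmetic for the hidden color change described above: a
# framebuffer encoded to Y'CbCr with the Rec.709 luma coefficients, but
# decoded by the TV with the hard-coded Rec.601 ones.

KR_709, KB_709 = 0.2126, 0.0722   # Rec.709 luma coefficients
KR_601, KB_601 = 0.299, 0.114     # Rec.601 luma coefficients

def rgb_to_ycbcr(r, g, b, kr, kb):
    """Encode normalized R'G'B' in [0, 1] to analog (unquantized) Y'CbCr."""
    y = kr * r + (1 - kr - kb) * g + kb * b
    cb = (b - y) / (2 * (1 - kb))
    cr = (r - y) / (2 * (1 - kr))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    """Exact inverse of rgb_to_ycbcr for the same coefficients."""
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    return r, g, b

# Pure green, encoded per Rec.709 but wrongly decoded per Rec.601:
y, cb, cr = rgb_to_ycbcr(0.0, 1.0, 0.0, KR_709, KB_709)
r, g, b = ycbcr_to_rgb(y, cb, cr, KR_601, KB_601)
print(round(r, 3), round(g, 3), round(b, 3))  # noticeably off from 0 1 0
```

Decoding with the wrong matrix shifts pure green to roughly (0.08, 1.17, 0.03): exactly the kind of unmodeled transformation that invalidates a profile made under different link settings.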

Re: HDR support in Wayland/Weston

2019-01-21 Thread Pekka Paalanen
On Fri, 18 Jan 2019 09:05:53 +0530
"Sharma, Shashank"  wrote:

> On 1/17/2019 5:33 PM, Pekka Paalanen wrote:
> > On Wed, 16 Jan 2019 09:25:06 +0530
> > "Sharma, Shashank"  wrote:
> >  
> >> On 1/14/2019 6:51 PM, Pekka Paalanen wrote:  
> >>> On Thu, 10 Jan 2019 20:32:18 +0530
> >>> "Sharma, Shashank"  wrote:
> >>>  
> >>>> Hello All,
> >>>>
> >>>> This mail is to propose a design for enabling HDR support in
> >>>> Wayland/Weston stack, using display engine capabilities, and get more
> >>>> feedback and input from community.  

> > In summary, rather than dynamically allow or not allow, a compositor
> > needs to live by its promise on what works. It cannot pull the rug from
> > under a client by suddenly hiding the window or showing it corrupted.
> > That is the baseline in behaviour.  
> Makes a lot of sense, and sounds like a stable design too.
> As per this design policy, while coming up, the compositor can analyze the 
> static environment conditions like:
> - HW support for HDR
> - Kernel support for HDR
> 
> and based on these two it can decide to advertise the HDR capabilities 
> via the protocol, and once it does so, it has to make sure that it lives 
> up to that expectation.
> Now, the only variable in the environment is a hot-pluggable monitor, 
> which might/might not support HDR playback, and that needs to be handled 
> at runtime by:
> - Doing H2S tone mapping, if monitor can't support HDR playback.
> - and REC2020->REC709 conversion using CSC.
> - using GL blending as fallback if we are not able to prepare a plane 
> only state.
> 
> Does it seem like correct interpretation of your suggestions ?

Yes, it does.

Mind, in the future Weston may very well also support GPU hotplug for
additional sinks, which means that the kernel and graphics card support
becomes variable as well. But handling that is no different from
handling monitor variability.
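The REC2020->REC709 conversion via CSC listed in the fallback steps above can be sketched as a 3x3 matrix on linear RGB (illustrative only; coefficients are the standard BT.2020-to-BT.709 primary matrix rounded to four decimals, and plain clipping stands in for real gamut mapping):

```python
# Sketch of the REC2020 -> REC709 color space conversion (CSC) fallback.
# Operates on *linear* light; out-of-gamut results are simply clipped here,
# whereas a compositor would pair this with proper gamut/tone mapping.

BT2020_TO_BT709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def csc(rgb):
    out = [sum(BT2020_TO_BT709[i][j] * rgb[j] for j in range(3))
           for i in range(3)]
    return [min(1.0, max(0.0, c)) for c in out]  # clip into [0, 1]

# White maps to white (each matrix row sums to ~1), while a saturated
# BT.2020 primary clips to the nearest representable BT.709 color:
print(csc([1.0, 1.0, 1.0]))
print(csc([1.0, 0.0, 0.0]))
```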

> > Then we can have some mechanisms in place to inform clients about
> > changed conditions, like sending HDR content becomes useful or stops
> > being useful. The first mechanism here is the wl_surface.enter/leave
> > events: if the wl_surface enters the first HDR output, or leaves the
> > last HDR output, the client will know and may adapt if it wants to.  
> Again, sounds like a good idea.

As discussed with Graeme, the mechanism here might well be the
"prioritised output" protocol yet to be invented.


> > This means that you need to avoid interface name conflicts between
> > weston and wayland-protocols extensions.  
> Got it, I think you are suggesting something like the way we 
> implemented the aspect-ratio protocol in weston.

Well, aspect ratio did not have any protocol in the end. :-)


Thanks,
pq


___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: HDR support in Wayland/Weston

2019-01-21 Thread Pekka Paalanen
On Fri, 18 Jan 2019 19:02:01 +1100
Graeme Gill  wrote:

> Pekka Paalanen wrote:
> 
> Hi,
> 
> > If a wl_surface straddles multiple outputs simultaneously, then
> > wl_surface.enter/leave events indicate the surface being on all those
> > outputs at the same time. The client is expected to take all the
> > entered outputs into consideration when it chooses how to render its
> > image. For HiDPI, this usually means taking the maximum output scale
> > from that set of outputs. The compositor will then automatically
> > (down)scale for the other outputs accordingly.  
> 
> right, I was guessing this is how it works. So an obvious heuristic
> for HiDPI is to render to the highest density output, on the basis
> that the compositor down-sampling is the least detrimental to quality.
> 
> A wrinkle in using the same idea for Color Management is that such
> a heuristic may not be as clear. For the case of a window being moved from
> one output to another, then either a last to leave or first to enter
> might make sense. If (as I presume) surfaces are multiply mapped
> to achieve mirroring or similar (picture in picture ?), then I'm not so
> sure how the most important display can be determined heuristically.
> Even in the case of a projector, I could imagine under some circumstances
> the projected color accuracy is secondary, and other circumstances it is
> most important.

Hi Graeme,

yes, this is a good concern. I think we might be needing an extension
to tell the client which output to prioritise for. We already have an
analogue of that in the Presentation-time extension: a wl_surface's
timings can only be synchronised to one output at a time, so the
presentation feedback events tell the client which output it was
synchronised to on each frame.

I cannot imagine a case where the output for timings and the output for
color would be different, but I also probably would not re-use the
Presentation-time extension here. Instead, we would probably need to
handle it in the color related extensions explicitly.

> > Yes and no. Yes, we do and should let clients know what kind of outputs
> > their contents will be shown on. However, we will in any case need the
> > compositor to be able to do the full and correct conversion from what
> > ever the client has submitted to what is correct for a specific output,
> > because nothing guarantees that those two would always match.  
> 
> I don't think that's technically possible, because it's analogous to
> Wayland taking on the rendering.

Your explanation further below is very enlightening on what "rendering"
is.

> But I also don't think it's necessary
> either, because as long as the client can do its rendering for the
> output that is most quality sensitive, then it's enough that the compositor
> can do a less perfect transform between the display or colorspace that was
> rendered to and the display it is actually appearing on. This is a much 
> simpler
> proposition than full general color management since you can assume that
> all buffers are display-like and only have 3 channels, and that simple
> to define color transform intents will be sufficient. Similarly to the HiDPI
> case, the visual quality may jump slightly if the surface is re-rendered
> for its actual output, but on most displays the difference shouldn't be terribly
> obvious unless you were looking for it.

Yes, I think we found the same page here. :-)

> Some nice properties of this approach are that it provides the usual
> mechanism for an application (like a Color Management profiling app.)
> to set device values using the null transform trick (setting source
> colorspace the same as destination. If the two profiles are the same,
> color conversion should be skipped by the CMM/compositor.)
> A second property is that it would be possible to set a default
> source colorspace for clients that don't know about color management
> (i.e. sRGB). This allows a "color managed desktop" using existing window
> managers and applications, solving the problems of using wide gamut or HDR
> displays running in their full mode.

Yes, indeed.

> > I'm not sure I understand what you mean. Above you seemed to agree with
> > this. Maybe I should emphasise the "if it has to"? That is, when for
> > whatever reason, the client provided content is not directly suitable
> > for the output it will be displayed on.  
> 
> OK, a detail I seem to be having difficulty conveying is the
> difference between doing a bulk conversion between two colorspaces,
> and rendering in a color aware fashion. Example :- consider rendering
> a document stored in a sophisticated page description language
> such as PDF, PS or XPF. These are color aware formats, and in general
> pages may be composed of multiple different graphic elements that can
> have their color described in a multitude of different ways. Some may
> be in device dependent spaces such as RGB. Some may be in CMYK.
> Some may be in device independent CIE based spaces such as L*a*b*.
> 

Re: HDR support in Wayland/Weston

2019-01-18 Thread Kai-Uwe
On 18.01.19 at 09:08, Graeme Gill wrote:
> Maybe rendering specifically for one output is sufficient
> as long as the secondary displays (however that is determined!) look
> OK with a fallback conversion by the compositor.

Currently the first or main monitor is the primary rendering target. As
long as it is switchable, that's fine with me.


Re: HDR support in Wayland/Weston

2019-01-18 Thread Graeme Gill
Sharma, Shashank wrote:

Hi,

> Yes, this is very accurate. We are planning to add a protocol extension, 
> which will allow
> a client to pass the buffers colorspace information to the compositor. And 
> compositor
> already has HW's color capabilities (drm properties per plane and CRTC), and 
> monitor color
> capabilities (EDID). So if compositor gets all of this, with some good 
> policies, it might
> be able to take a call on colorspace accurately.

my impression from past feedback is that EDID display characterization as a rule
is not to be relied on, at least not if any degree of accuracy is desired. Maybe
the manufacturers have got better.

But I can sympathize with the desire of not getting bogged down in
color management stuff if you are just trying to get something
vaguely reasonable displayed on HDR.

> Correct, in fact, we would be using these same CRTC channels to apply the 
> EOTFs. If we go
> through the REC2020/2023 spec carefully, we realize that EOTF/OETF curves are 
> nothing but
> gamma/degamma correction curves for HDR monitors while they are displaying a 
> much wider
> gamut, like BT2020 or DCI-P3. I had written one drm-compositor based 
> implementation of
> sample color management using these CRTC/Plane color properties being exposed 
> by DRM, for
> accurate blending, but now I want to go through and understand Niels's color 
> management
> solution first.

Sure. If you just take the HDR monitors as being nominally HDR10 or similar, 
then you can
do a by-the-spec conversion and get something working.
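A by-the-spec starting point for nominal HDR10 is the SMPTE ST 2084 (PQ) EOTF. A sketch (the constants are the ones published in ST 2084; everything around them is illustrative):

```python
# SMPTE ST 2084 (PQ) EOTF: maps a non-linear code value in [0, 1] to
# absolute display luminance in cd/m^2. Constants from the ST 2084 spec.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(e):
    """PQ-encoded signal e in [0, 1] -> luminance in cd/m^2 (nits)."""
    p = e ** (1 / M2)
    y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)
    return 10000.0 * y

print(pq_eotf(0.0), pq_eotf(1.0))  # 0.0 nits at black, 10000.0 at full code
```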

Cheers,
Graeme Gill.


Re: HDR support in Wayland/Weston

2019-01-18 Thread Graeme Gill
Adam Jackson wrote:

Hi,

> This isn't necessarily true. The server is free to just draw a black
> rectangle (or nothing) instead if the image doesn't match the target
> colorspace. If you want to handle the case of cloned outputs or
> crossing output borders, let the client attach one image per output
> colorspace if it wants, and let the server send the client events to
> indicate when it should start or stop drawing to a particular
> colorspace. You need that complexity in the client's renderer anyway,
> why add it to the server's?

yes, that crossed my mind. I'm wondering if in practice that is
overkill though. Maybe rendering specifically for one output is sufficient
as long as the secondary displays (however that is determined!) look
OK with a fallback conversion by the compositor. Multiple optimized
renderings per surface could always be added to the hybrid/fallback scheme
at a later point?

Cheers,
Graeme Gill.


Re: HDR support in Wayland/Weston

2019-01-18 Thread Graeme Gill
Pekka Paalanen wrote:

Hi,

> If a wl_surface straddles multiple outputs simultaneously, then
> wl_surface.enter/leave events indicate the surface being on all those
> outputs at the same time. The client is expected to take all the
> entered outputs into consideration when it chooses how to render its
> image. For HiDPI, this usually means taking the maximum output scale
> from that set of outputs. The compositor will then automatically
> (down)scale for the other outputs accordingly.

right, I was guessing this is how it works. So an obvious heuristic
for HiDPI is to render to the highest density output, on the basis
that the compositor down-sampling is the least detrimental to quality.
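From the client's side, that HiDPI heuristic amounts to tracking wl_surface.enter/leave and rendering at the maximum scale among the entered outputs. A minimal sketch (names and data shapes are hypothetical, not the libwayland API):

```python
# Client-side HiDPI heuristic: render for the densest output the surface
# currently occupies; the compositor downscales for the other outputs.

entered = {}  # output id -> integer output scale

def on_enter(output_id, scale):
    entered[output_id] = scale

def on_leave(output_id):
    entered.pop(output_id, None)

def buffer_scale():
    """Scale to render at; defaults to 1 when no outputs are entered."""
    return max(entered.values(), default=1)

on_enter("DP-1", 1)
on_enter("DP-2", 2)       # surface now straddles a HiDPI output
print(buffer_scale())     # render at scale 2; DP-1 gets a downscaled view
```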

A wrinkle in using the same idea for Color Management is that such
a heuristic may not be as clear. For the case of a window being moved from
one output to another, then either a last to leave or first to enter
might make sense. If (as I presume) surfaces are multiply mapped
to achieve mirroring or similar (picture in picture ?), then I'm not so
sure how the most important display can be determined heuristically.
Even in the case of a projector, I could imagine under some circumstances
the projected color accuracy is secondary, and other circumstances it is
most important.

> This scheme also means that the compositor does not necessarily need to
> wait for a client to render when the outputs suddenly change. It knows
> how to transform the existing image for new outputs already. The visual
> quality may jump afterwards when the client catches up, but there is no
> window blinking in and out of existence.

Right, I took this as a particular design aim in Wayland.

> Yes and no. Yes, we do and should let clients know what kind of outputs
> their contents will be shown on. However, we will in any case need the
> compositor to be able to do the full and correct conversion from what
> ever the client has submitted to what is correct for a specific output,
> because nothing guarantees that those two would always match.

I don't think that's technically possible, because it's analogous to
Wayland taking on the rendering. But I also don't think it's necessary
either, because as long as the client can do its rendering for the
output that is most quality sensitive, then it's enough that the compositor
can do a less perfect transform between the display or colorspace that was
rendered to and the display it is actually appearing on. This is a much simpler
proposition than full general color management since you can assume that
all buffers are display-like and only have 3 channels, and that simple
to define color transform intents will be sufficient. Similarly to the HiDPI
case, the visual quality may jump slightly if the surface is re-rendered
for its actual output, but on most displays the difference shouldn't be terribly
obvious unless you were looking for it.

Some nice properties of this approach are that it provides the usual
mechanism for an application (like a Color Management profiling app.)
to set device values using the null transform trick (setting source
colorspace the same as destination. If the two profiles are the same,
color conversion should be skipped by the CMM/compositor.)
A second property is that it would be possible to set a default
source colorspace for clients that don't know about color management
(i.e. sRGB). This allows a "color managed desktop" using existing window
managers and applications, solving the problems of using wide gamut or HDR
displays running in their full mode.
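The null-transform trick above can be sketched in a few lines (hypothetical names, not a real compositor API): when the attached source colorspace equals the output's profile, the CMM skips conversion, so a profiling application can drive raw device values to the display.

```python
# Null-transform trick: conversion is applied only when profiles differ.

def composite_pixel(pixel, src_profile, dst_profile, convert):
    """Apply the CMM conversion only when the profiles actually differ."""
    if src_profile == dst_profile:
        return pixel  # null transform: device values pass through untouched
    return convert(pixel)

# A profiling app tags its test patch with the output's own profile:
patch = (0.25, 0.50, 0.75)
out = composite_pixel(patch, "monitor1.icc", "monitor1.icc",
                      convert=lambda p: tuple(min(1.0, c * 1.1) for c in p))
print(out == patch)  # True: the patch reaches the display unmodified
```

The same dispatch point is where a default source profile (sRGB) would be assumed for clients that attach no colorspace at all.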

> This makes no use of the monitor HDR capability, but it would start
> with the most important feature: backward compatibility. Adding the
> Wayland protocol to allow clients to reasonably produce and submit HDR
> content could come after this as a next phase.

Yes, that would be a good thing to get out of it.

> I'm not sure I understand what you mean. Above you seemed to agree with
> this. Maybe I should emphasise the "if it has to"? That is, when for
> whatever reason, the client provided content is not directly suitable
> for the output it will be displayed on.

OK, a detail I seem to be having difficulty conveying is the
difference between doing a bulk conversion between two colorspaces,
and rendering in a color aware fashion. Example :- consider rendering
a document stored in a sophisticated page description language
such as PDF, PS or XPF. These are color aware formats, and in general
pages may be composed of multiple different graphic elements that can
have their color described in a multitude of different ways. Some may
be in device dependent spaces such as RGB. Some may be in CMYK.
Some may be in device independent CIE based spaces such as L*a*b*.
Some may be spot colors. Each may be specified to be rendered with
specific color transform intents. For instance, a screen capture image may
be specified in RGB and rendered with a Relative Colorimetric Intent,
where the white points are matched and colors otherwise directly 

Re: HDR support in Wayland/Weston

2019-01-17 Thread Sharma, Shashank

Regards

Shashank


On 1/17/2019 5:33 PM, Pekka Paalanen wrote:

On Wed, 16 Jan 2019 09:25:06 +0530
"Sharma, Shashank"  wrote:


On 1/14/2019 6:51 PM, Pekka Paalanen wrote:

On Thu, 10 Jan 2019 20:32:18 +0530
"Sharma, Shashank"  wrote:


Hello All,

This mail is to propose a design for enabling HDR support in
Wayland/Weston stack, using display engine capabilities, and get more
feedback and input from community.

*snip*


I understand your aim is to leverage display hardware capabilities to
the fullest, but we must also consider hardware that lacks some or all
of the conversion/mapping/other features while the monitor is well
HDR-capable. We also need to consider what happens when a monitor is
not HDR-capable or is somehow lacking. OTOH, whether a compositor
implements HDR support at all would be obvious in the advertised
Wayland globals and pixel formats.

Very valid point. We have given good thought to how to handle such
scenarios, and we can come up with some kind of
compositor policies which will decide if HDR video playback should be
allowed or not, depending on the combination of Content type, SW
stack(compositor and kernel), HW capabilities and connected Monitor
capabilities. A sample such policy may look like (also attached as txt
file, just in case if this table gets distorted):

++--+
|Content |SW (C|K)|HW |Monitor   | HDR Playback

*clip*

Talking in terms of "allowed" and "not allowed" sounds very much like
we would be needing "complicated" Wayland protocol to let applications
fail gracefully at runtime, letting them know dynamically when things
would work or not work. I believe we could do much simpler in protocol
terms as follows:

Does a compositor advertise HDR support extensions at all?

- This would depend on the compositor implementation obviously; it cannot
   advertise anything without one.

- Optionally, it could depend on the graphics card hardware/driver
   capabilities: if there is no card that could support HDR, then there
   is no reason to advertise HDR support through Wayland, because it would
   always fall back to conversion to SDR. However, note that GPU hotplug
   might be a thing, which might bring HDR support later at runtime.

- Third, optionally again, a compositor might choose to not advertise
   HDR support if it knows it will never have an HDR-capable monitor
   attached. This is a much longer stretch, and probably only for embedded
   devices you cannot plug arbitrary monitors to.

Once the Wayland HDR-related extensions have been advertised to clients
at runtime, taking them away will be hard. You may want to consider to
never revoke the extension interfaces if they have once been published
in the lifetime of a compositor instance, because revoking Wayland
globals has some caveats, mostly around clients still using HDR
extensions until they actually notice the compositor wants to redact
them. It can be made to work, but I'm not sure what the benefit would
be.

So, once a compositor advertises the extensions, they have to keep on
working at all times. Specifically this means, that if a client has
submitted a frame in HDR, and the compositor suddenly loses the ability
to physically display HDR, e.g. the only HDR monitor gets unplugged and
only SDR monitors remain, the compositor must still be able to show the
window that has HDR content lingering. So converting HDR to SDR
on-demand is a mandatory feature.

The allowed vs. not allowed is only applicable with respect to what
capabilities the compositor has advertised.

A client is not "allowed" to submit HDR content if the compositor does
not expose the Wayland extensions. Actually this is not about allowing,
but being able to submit HDR content at all: if the interfaces are not
advertised, a client simply has no interface to poke at.

Pixel formats, color spaces, and so on are more interesting. The
compositor should advertise what it supports by enumerating them
explicitly or saying what description formats it supports. Then a
client cannot use anything outside of those; if it attempts to, that
will be a fatal protocol error, not a recoverable failure.

If a client does everything according to what a compositor advertises,
it must "work": the compositor must be able to consume the client
content regardless of what will happen next, e.g. with monitor
hot-unplug, or scenegraph changing such that overlay plane is no longer
usable. This is why the fallback path through GL-renderer must exist,
and it must be able to do HDR->SDR mapping, and so on.

In summary, rather than dynamically allow or not allow, a compositor
needs to live by its promise on what works. It cannot pull the rug from
under a client by suddenly hiding the window or showing it corrupted.
That is the baseline in behaviour.

Makes a lot of sense, and sounds like a stable design too.

Re: HDR support in Wayland/Weston

2019-01-17 Thread Pekka Paalanen
On Tue, 15 Jan 2019 14:14:53 -0500
Adam Jackson  wrote:

> On Tue, 2019-01-15 at 11:30 +0200, Pekka Paalanen wrote:
> > On Tue, 15 Jan 2019 13:47:07 +1100
> > Graeme Gill  wrote:
> >   
> > > If done in the composer, it would need to render the graphic elements to
> > > the output DPI / convert the source colorspace to the output colorspace.
> > > But the composer would need the code to do rendering / convert colorspaces
> > > (as well as being told what the graphic elements / source colorspace is),
> > > and this is not the role Wayland has - that's the responsibility of the
> > > client, so instead Wayland makes it possible for the client to know what 
> > > DPI
> > > it is rendering to. The analogous facility for CM is for the client to 
> > > know
> > > what output colorspace it is rendering for.  
> > 
> > Yes and no. Yes, we do and should let clients know what kind of outputs
> > their contents will be shown on. However, we will in any case need the
> > compositor to be able to do the full and correct conversion from what
> > ever the client has submitted to what is correct for a specific output,
> > because nothing guarantees that those two would always match.  
> 
> This isn't necessarily true. The server is free to just draw a black
> rectangle (or nothing) instead if the image doesn't match the target
> colorspace.

Hi Adam,

in my opinion it is not acceptable in a Wayland compositor.

Wayland protocol does not say anything about the final visual, so
showing a corrupted window (black, invisible, whatever) is not a
protocol violation per se, but I still think it would be a buggy
compositor if I can observe it.

At least Weston should do better. OTOH, a specialised compositor running
in a fixed environment and very specific use cases might be able to
manage with showing black, because no intended use case would make the
automatic conversion necessary. The protocol shouldn't care, both
implementations should be possible.

If no-one ever triggers a bug, does the bug need fixing?

> If you want to handle the case of cloned outputs or
> crossing output borders, let the client attach one image per output
> colorspace if it wants, and let the server send the client events to
> indicate when it should start or stop drawing to a particular
> colorspace. You need that complexity in the client's renderer anyway,
> why add it to the server's?

We could do that, but then it would not allow to blend different
windows correctly together. It would only work for completely opaque
windows.

The increase in memory usage would also be significant. The decrease in
overall system performance might be too, though that is not obvious.

> > One wl_surface on multiple outputs is an obvious case where one buffer
> > rendered by a client cannot match all the outputs it is shown on. The
> > other case is transitions between outputs, where we cannot have the
> > compositor wait for the client to re-draw with new color parameters.  
> 
> Honestly I think of this as an implementation issue?

Yes, it very much is an implementation issue.

> If we take the
> above multiple-images approach, then if it's my compositor I just omit
> drawing actors onto any output where there isn't an image for that
> colorspace, because I am comfortable saying any further latency is the
> client's renderer's problem. Someone else's compositor might try to
> bend the existing image to the output colorspace until the client has
> caught up, believing close-but-wrong color is better than visible
> absence of color. I could see an argument for either implementation
> depending on the environment, and I don't really see why the protocol
> spec should require one or the other.

That is correct.

The protocol spec will not require one way or another. The protocol
spec only needs to enable both ways.


Thanks,
pq




Re: HDR support in Wayland/Weston

2019-01-17 Thread Pekka Paalanen
On Tue, 15 Jan 2019 13:19:00 +0100
Niels Ole Salscheider  wrote:

> On Tuesday, 15 January 2019, 10:30:14 CET, Pekka Paalanen wrote:

> > Yes and no. Yes, we do and should let clients know what kind of outputs
> > their contents will be shown on. However, we will in any case need the
> > compositor to be able to do the full and correct conversion from what
> > ever the client has submitted to what is correct for a specific output,
> > because nothing guarantees that those two would always match.
> > 
> > One wl_surface on multiple outputs is an obvious case where one buffer
> > rendered by a client cannot match all the outputs it is shown on. The
> > other case is transitions between outputs, where we cannot have the
> > compositor wait for the client to re-draw with new color parameters.  
> 
> I think the last proposal of a color management protocol that we discussed 
> does that. It contains the device link profiles and it also allows the client 
> to query the profile of wl_outputs. With that, an application can display 
> accurate colors in nearly every situation, even on multiple screens 
> simultaneously. But still the compositor can do its best to provide a good 
> output in some corner cases (e. g. when a new screen is activated and the 
> application has not rendered a new frame yet). Once the application reacts to 
> that change the output will be perfect again.

Sounds good! Sorry, I have not had a chance to refresh my memory on that
proposal, but I have high hopes that it will fit the HDR use case
nicely.


Thanks,
pq




Re: HDR support in Wayland/Weston

2019-01-17 Thread Pekka Paalanen
On Wed, 16 Jan 2019 09:25:06 +0530
"Sharma, Shashank"  wrote:

> On 1/14/2019 6:51 PM, Pekka Paalanen wrote:
> > On Thu, 10 Jan 2019 20:32:18 +0530
> > "Sharma, Shashank"  wrote:
> >
> >> Hello All,
> >>
> >> This mail is to propose a design for enabling HDR support in
> >> Wayland/Weston stack, using display engine capabilities, and get more
> >> feedback and input from community.

*snip*

> > I understand your aim is to leverage display hardware capabilities to
> > the fullest, but we must also consider hardware that lacks some or all
> > of the conversion/mapping/other features while the monitor is well
> > HDR-capable. We also need to consider what happens when a monitor is
> > not HDR-capable or is somehow lacking. OTOH, whether a compositor
> > implements HDR support at all would be obvious in the advertised
> > Wayland globals and pixel formats.

> Very valid point. We have given good thought to how to handle such 
> scenarios, and we can come up with some kind of 
> compositor policies which will decide if HDR video playback should be 
> allowed or not, depending on the combination of Content type, SW 
> stack(compositor and kernel), HW capabilities and connected Monitor 
> capabilities. A sample such policy may look like (also attached as txt 
> file, just in case if this table gets distorted):
> 
> ++--+
> |Content |SW (C|K)|HW |Monitor   | HDR Playback 

*clip*

Talking in terms of "allowed" and "not allowed" sounds very much like
we would be needing "complicated" Wayland protocol to let applications
fail gracefully at runtime, letting them know dynamically when things
would work or not work. I believe we could do much simpler in protocol
terms as follows:

Does a compositor advertise HDR support extensions at all?

- This would depend on the compositor implementation obviously; it cannot
  advertise anything without one.

- Optionally, it could depend on the graphics card hardware/driver
  capabilities: if there is no card that could support HDR, then there
  is no reason to advertise HDR support through Wayland, because it would
  always fall back to conversion to SDR. However, note that GPU hotplug
  might be a thing, which might bring HDR support later at runtime.

- Third, optionally again, a compositor might choose to not advertise
  HDR support if it knows it will never have an HDR-capable monitor
  attached. This is a much longer stretch, and probably only for embedded
  devices you cannot plug arbitrary monitors to.

Once the Wayland HDR-related extensions have been advertised to clients
at runtime, taking them away will be hard. You may want to consider to
never revoke the extension interfaces if they have once been published
in the lifetime of a compositor instance, because revoking Wayland
globals has some caveats, mostly around clients still using HDR
extensions until they actually notice the compositor wants to redact
them. It can be made to work, but I'm not sure what the benefit would
be.

So, once a compositor advertises the extensions, they have to keep on
working at all times. Specifically this means, that if a client has
submitted a frame in HDR, and the compositor suddenly loses the ability
to physically display HDR, e.g. the only HDR monitor gets unplugged and
only SDR monitors remain, the compositor must still be able to show the
window that has HDR content lingering. So converting HDR to SDR
on-demand is a mandatory feature.
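The mandatory on-demand HDR->SDR fallback can be sketched with a toy tone-mapping operator. This is purely illustrative: the thread does not specify which operator Weston would use, and the 203-nit SDR reference white is an assumed convention, not something stated in the proposal.

```python
# Toy HDR -> SDR luminance fallback using a simple Reinhard curve.
# Real compositors use more sophisticated, often parameterised, curves.

SDR_WHITE_NITS = 203.0  # assumed SDR reference white

def hdr_to_sdr(nits):
    """Compress absolute luminance (nits) into [0, 1) SDR linear light."""
    x = nits / SDR_WHITE_NITS
    return x / (1.0 + x)  # Reinhard: monotonic, never reaches 1.0

# Highlights are compressed hard while mid-tones survive:
for nits in (0.0, 100.0, 203.0, 1000.0, 10000.0):
    print(nits, round(hdr_to_sdr(nits), 3))
```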

The allowed vs. not allowed is only applicable with respect to what
capabilities the compositor has advertised.

A client is not "allowed" to submit HDR content if the compositor does
not expose the Wayland extensions. Actually this is not about allowing,
but being able to submit HDR content at all: if the interfaces are not
advertised, a client simply has no interface to poke at.

Pixel formats, color spaces, and so on are more interesting. The
compositor should advertise what it supports by enumerating them
explicitly or saying what description formats it supports. Then a
client cannot use anything outside of those; if it attempts to, that
will be a fatal protocol error, not a recoverable failure.
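The advertise-then-enforce semantics above can be sketched as follows (a hypothetical compositor-side check, not actual Weston code): anything a client requests outside the advertised set is a fatal protocol error, not a recoverable runtime failure.

```python
# Advertise-then-enforce: requests outside the advertised set are fatal.

ADVERTISED_COLORSPACES = {"sRGB", "BT.2020-PQ", "DCI-P3"}  # example set

class ProtocolError(Exception):
    """Fatal for the client connection, by analogy with wl_resource errors."""

def set_surface_colorspace(surface_state, requested):
    if requested not in ADVERTISED_COLORSPACES:
        raise ProtocolError(f"colorspace never advertised: {requested}")
    surface_state["colorspace"] = requested
    return surface_state

state = set_surface_colorspace({}, "BT.2020-PQ")  # fine, was advertised
print(state["colorspace"])
```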

If a client does everything according to what a compositor advertises,
it must "work": the compositor must be able to consume the client
content regardless of what happens next, e.g. a monitor being
hot-unplugged, or the scenegraph changing such that an overlay plane is
no longer usable. This is why the fallback path through GL-renderer must
exist, and it must be able to do HDR->SDR mapping, and so on.

In summary, rather than dynamically allowing or disallowing, a
compositor needs to live by its promise of what works. It cannot pull
the rug from under

Re: HDR support in Wayland/Weston

2019-01-17 Thread Harish Krupo
Hi Arnaud,

Thank you for the comments, please find mine inline.

Arnaud Vrac  writes:

> On Thu, Jan 17, 2019 at 4:26 AM Sharma, Shashank
>  wrote:
>>
>> > The proposal is missing many important bits like negotiation of the
>> > supported output features with the client, double buffering the new
>> > colorspace related surface properties, using more of the hardware
>> > capabilities, performance issues, etc...
>>
>> > Also, the added protocols are
>> > probably too simple as far as color management is concerned.
>> Agree, there are two reasons for that:
>> - This proposal is a very high level design focusing only on the changes
>> required to drive HDR video playback; in the real implementation
>> you would see many of those mentioned. I think it's too early to talk
>> about performance as we are still in the design stage.
>> - As we have been discussing in parallel threads, HDR is too big a
>> feature, and we don't want to add too much code in a single shot and
>> create unwanted regressions and maintenance nightmares. Rather, the aim
>> is to create a small, modular, scalable, easy-to-review-and-test
>> feature set, which might target a very specific area, and
>> gradually complete this feature.
>>
>> But I would like to hear more about double buffering of the new
>> colorspace related surface properties, if you could please elaborate?
>
> The colorspace related properties should be applied atomically when
> committing the wl_surface. This is not done in Ville's patches, so
> there might be some rendering glitches when changing the colorspace
> while the surface is displayed.
>

Yes, both the colorspace property and the HDR metadata for a surface are
double buffered and are applied only on wl_surface.commit. This should
be clear once we start posting our patches.
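The pending/current split described here follows the usual wl_surface double-buffering pattern; a hedged sketch (the names are illustrative, not the actual Weston code):

```c
/* Double-buffered per-surface color state: protocol requests only
 * touch `pending`; wl_surface.commit atomically latches it into
 * `current`, which is what the renderer reads. */
struct hdr_metadata_sketch {
    double max_cll;   /* max content light level, cd/m^2 */
    double max_fall;  /* max frame-average light level, cd/m^2 */
};

struct surface_color_state {
    int colorspace;                  /* e.g. an enum from the extension */
    struct hdr_metadata_sketch hdr;
};

struct surface_sketch {
    struct surface_color_state pending;
    struct surface_color_state current;
};

static void surface_set_colorspace(struct surface_sketch *s, int cs)
{
    s->pending.colorspace = cs;      /* no visible effect yet */
}

static void surface_commit(struct surface_sketch *s)
{
    s->current = s->pending;         /* applied atomically on commit */
}
```

This is what makes the property change glitch-free: nothing the renderer sees changes until the commit.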

Thank you
Regards
Harish Krupo
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: HDR support in Wayland/Weston

2019-01-17 Thread Arnaud Vrac
On Thu, Jan 17, 2019 at 4:26 AM Sharma, Shashank
 wrote:
>
> Hello Arnaud
>
> Thanks for your comments, mine inline.
>
> Regards
> Shashank
> On 1/17/2019 6:38 AM, Arnaud Vrac wrote:
> > On Thu, Jan 10, 2019 at 4:02 PM Sharma, Shashank
> >  wrote:
> >> Hello All,
> >>
> >> This mail is to propose a design for enabling HDR support in 
> >> Wayland/Weston stack, using display engine capabilities, and get more 
> >> feedback and input from community.
> >> Here are few points (you might already know these), about HDR 
> >> framebuffers, videos and displays:
> >> - HDR content/buffers are composed in REC2020 colorspace, with bit depth
> >> 10/12/16 BPC. Some of the popular formats are P010, P012, P016.
> >> - HDR content comes with its own metadata to be applied to get the right
> >> luminance at the display device.
> >>  - The metadata can be of two types: 1. static, 2. dynamic. For
> >> simplicity, this solution is focusing on static HDR only (HDR10 standard)
> >> - HDR content also provides its supported EOTF (electro optical transfer
> >> function) information, which is a curve (like the SRGB gamma curve). One
> >> popular EOTF is PQ (ST2084).
> >> - HDR capable displays mention their EOTF and HDR metadata support 
> >> information in EDID CEA-861-G blocks.
> >> - Normal SRGB buffers are composed in SRGB color space following REC709 
> >> specifications.
> >> - For accurate blending in display engines, we need to make sure of the following:
> >>  - All the buffers are in the same colorspace (Rec 709 or Rec 2020)
> >>  - All the buffers are linear (gamma/EOTF removed)
> >>  - All the buffers are tone mapped in the same zone (HDR or SDR)
> >>
> >> Please refer to the block diagram below, which presents a simple case of a 
> >> HDR P010 movie playback, with HDR buffers as video buffers, and SDR 
> >> buffers as subtitles. The subsystem looks and works like this:
> >> - A client decodes the buffer (using FFMpeg for example) and gets the two 
> >> buffers, one with video (HDR) and one subtitles (SDR)
> >> - Client passes following information to the compositor:
> >>   - The actual buffers
> >>   - Their colorspace information, BT2020 for the HDR buffer, REC709 for
> >> the SDR buffer (planning to add a new protocol extension for this)
> >>   - The HDR metadata of the content (planning to add new protocol for 
> >> this)
> >>
> >> - Compositor's actions:
> >> - Reads the end display's HDR capabilities from the display EDID. Assume
> >> it's an HDR HDMI monitor.
> >> - Compositor tone maps every view's framebuffer to match tone of end 
> >> display, applying a libVA filter. In this example:
> >>  - The SDR subtitles frame will go through SDR to HDR tone mapping 
> >> (called S2H)
> >>  - The HDR video frame will go through HDR to HDR tone mapping 
> >> (called H2H) if the HDR capabilities of monitor and content are different.
> >>  - Now both the buffers and the monitor are in the same tone 
> >> mapped range.
> >>  - As the end display is HDR capable, and one of the content frame is 
> >> HDR, the compositor will prepare all other planes for color space 
> >> conversion (CSC) from REC709->REC2020 using plane CSC property.
> >>  - As the CSC and blending should be done in linear space, the compositor
> >> will also use plane level degamma to make the buffers linear.
> >>  - These actions will make sure that, during blending:
> >>  - All the buffers are in same colorspace (REC2020)
> >>  - All the buffers are linear
> >>  - All the buffers are tone mapped (HDR)
> >>  - The plane level color properties patch, for DRM can be found 
> >> here: https://patchwork.freedesktop.org/series/30875/
> >>  - Now, in order to re-apply the HDR curve, compositor will apply CRTC 
> >> level gamma, so that the output buffer is non-linear again.
> >>  - To pass the output HDR information to kernel, so that it can create 
> >> and send AVI-info-frames to HDMI, compositor will set Connector HDR
> >> metadata property.

Re: HDR support in Wayland/Weston

2019-01-16 Thread Sharma, Shashank

Hello Arnaud

Thanks for your comments, mine inline.

Regards
Shashank
On 1/17/2019 6:38 AM, Arnaud Vrac wrote:

On Thu, Jan 10, 2019 at 4:02 PM Sharma, Shashank
 wrote:

Hello All,

This mail is to propose a design for enabling HDR support in Wayland/Weston 
stack, using display engine capabilities, and get more feedback and input from 
community.
Here are few points (you might already know these), about HDR framebuffers, 
videos and displays:
- HDR content/buffers are composed in REC2020 colorspace, with bit depth
10/12/16 BPC. Some of the popular formats are P010, P012, P016.
- HDR content comes with its own metadata to be applied to get the right
luminance at the display device.
 - The metadata can be of two types: 1. static, 2. dynamic. For simplicity,
this solution is focusing on static HDR only (HDR10 standard)
- HDR content also provides its supported EOTF (electro optical transfer
function) information, which is a curve (like the SRGB gamma curve). One popular
EOTF is PQ (ST2084).
- HDR capable displays mention their EOTF and HDR metadata support information 
in EDID CEA-861-G blocks.
- Normal SRGB buffers are composed in SRGB color space following REC709 
specifications.
- For accurate blending in display engines, we need to make sure of the following:
 - All the buffers are in the same colorspace (Rec 709 or Rec 2020)
 - All the buffers are linear (gamma/EOTF removed)
 - All the buffers are tone mapped in the same zone (HDR or SDR)

Please refer to the block diagram below, which presents a simple case of a HDR 
P010 movie playback, with HDR buffers as video buffers, and SDR buffers as 
subtitles. The subsystem looks and works like this:
- A client decodes the buffer (using FFMpeg for example) and gets the two 
buffers, one with video (HDR) and one subtitles (SDR)
- Client passes following information to the compositor:
  - The actual buffers
  - Their colorspace information, BT2020 for the HDR buffer, REC709 for the SDR
buffer (planning to add a new protocol extension for this)
  - The HDR metadata of the content (planning to add new protocol for this)

- Compositor's actions:
- Reads the end display's HDR capabilities from the display EDID. Assume it's an
HDR HDMI monitor.
- Compositor tone maps every view's framebuffer to match tone of end 
display, applying a libVA filter. In this example:
 - The SDR subtitles frame will go through SDR to HDR tone mapping 
(called S2H)
 - The HDR video frame will go through HDR to HDR tone mapping (called 
H2H) if the HDR capabilities of monitor and content are different.
 - Now both the buffers and the monitor are in the same tone mapped 
range.
 - As the end display is HDR capable, and one of the content frame is HDR, the 
compositor will prepare all other planes for color space conversion (CSC) from 
REC709->REC2020 using plane CSC property.
 - As the CSC and blending should be done in linear space, the compositor will
also use plane level degamma to make the buffers linear.
 - These actions will make sure that, during blending:
 - All the buffers are in same colorspace (REC2020)
 - All the buffers are linear
 - All the buffers are tone mapped (HDR)
 - The plane level color properties patch, for DRM can be found here: 
https://patchwork.freedesktop.org/series/30875/
 - Now, in order to re-apply the HDR curve, compositor will apply CRTC 
level gamma, so that the output buffer is non-linear again.
 - To pass the output HDR information to kernel, so that it can create and 
send AVI-info-frames to HDMI, compositor will set Connector HDR metadata 
property.
 - Code for the same can be found here: 
https://patchwork.freedesktop.org/series/25091/
 - And they will ever live happily after :).

Please provide inputs, feedback and suggestions for this design and plan, so
that we can improve our half-cooked solution, and start sending the patches.

  [ASCII block diagram, garbled in the archive: an SDR subtitle buffer
  (REC 709 colorspace) and an HDR video buffer (REC 2020 colorspace)
  both feed into the compositor, which assigns views to overlays and
  prepares plane/CRTC color properties for linear blending in the
  display engine; a LibVA block performs the SDR-to-HDR and HDR-to-HDR
  tone mapping. The diagram is truncated at this point.]

Re: HDR support in Wayland/Weston

2019-01-16 Thread Arnaud Vrac
On Thu, Jan 10, 2019 at 4:02 PM Sharma, Shashank
 wrote:
>
> Hello All,
>
> This mail is to propose a design for enabling HDR support in Wayland/Weston 
> stack, using display engine capabilities, and get more feedback and input 
> from community.
> Here are few points (you might already know these), about HDR framebuffers, 
> videos and displays:
> - HDR content/buffers are composed in REC2020 colorspace, with bit depth
> 10/12/16 BPC. Some of the popular formats are P010, P012, P016.
> - HDR content comes with its own metadata to be applied to get the right
> luminance at the display device.
> - The metadata can be of two types: 1. static, 2. dynamic. For simplicity,
> this solution is focusing on static HDR only (HDR10 standard)
> - HDR content also provides its supported EOTF (electro optical transfer
> function) information, which is a curve (like the SRGB gamma curve). One popular
> EOTF is PQ (ST2084).
> - HDR capable displays mention their EOTF and HDR metadata support 
> information in EDID CEA-861-G blocks.
> - Normal SRGB buffers are composed in SRGB color space following REC709 
> specifications.
> - For accurate blending in display engines, we need to make sure of the following:
> - All the buffers are in the same colorspace (Rec 709 or Rec 2020)
> - All the buffers are linear (gamma/EOTF removed)
> - All the buffers are tone mapped in the same zone (HDR or SDR)
>
> Please refer to the block diagram below, which presents a simple case of a 
> HDR P010 movie playback, with HDR buffers as video buffers, and SDR buffers 
> as subtitles. The subsystem looks and works like this:
> - A client decodes the buffer (using FFMpeg for example) and gets the two 
> buffers, one with video (HDR) and one subtitles (SDR)
> - Client passes following information to the compositor:
>  - The actual buffers
>  - Their colorspace information, BT2020 for the HDR buffer, REC709 for the SDR
> buffer (planning to add a new protocol extension for this)
>  - The HDR metadata of the content (planning to add new protocol for this)
>
> - Compositor's actions:
>- Reads the end display's HDR capabilities from the display EDID. Assume it's
> an HDR HDMI monitor.
>- Compositor tone maps every view's framebuffer to match tone of end 
> display, applying a libVA filter. In this example:
> - The SDR subtitles frame will go through SDR to HDR tone mapping 
> (called S2H)
> - The HDR video frame will go through HDR to HDR tone mapping (called 
> H2H) if the HDR capabilities of monitor and content are different.
> - Now both the buffers and the monitor are in the same tone mapped 
> range.
> - As the end display is HDR capable, and one of the content frame is HDR, 
> the compositor will prepare all other planes for color space conversion (CSC) 
> from REC709->REC2020 using plane CSC property.
> - As the CSC and blending should be done in linear space, the compositor will
> also use plane level degamma to make the buffers linear.
> - These actions will make sure that, during blending:
> - All the buffers are in same colorspace (REC2020)
> - All the buffers are linear
> - All the buffers are tone mapped (HDR)
> - The plane level color properties patch, for DRM can be found here: 
> https://patchwork.freedesktop.org/series/30875/
> - Now, in order to re-apply the HDR curve, compositor will apply CRTC 
> level gamma, so that the output buffer is non-linear again.
> - To pass the output HDR information to kernel, so that it can create and 
> send AVI-info-frames to HDMI, compositor will set Connector HDR metadata 
> property.
> - Code for the same can be found here: 
> https://patchwork.freedesktop.org/series/25091/
> - And they will ever live happily after :).
>
> Please provide inputs, feedback and suggestions for this design and plan, so
> that we can improve our half-cooked solution, and start sending the patches.
>
>  [ASCII block diagram, garbled in the archive: an SDR subtitle buffer
> (REC 709 colorspace) and an HDR video buffer (REC 2020 colorspace)
> feed into the compositor and a LibVA tone-mapping block. The diagram
> is truncated at this point in the quote.]

Re: HDR support in Wayland/Weston

2019-01-15 Thread Sharma, Shashank

Hello Graeme,

Thanks for your inputs and comment, please find mine inline.

Regards
Shashank
On 1/11/2019 12:55 PM, Graeme Gill wrote:

Sharma, Shashank wrote:

Hi,

While I'm sure you could hard code various color space assumptions into
such an implementation (and perhaps this is a pretty reasonable way
of doing a proof of concept), it's not a good long term solution,
and could end up being something of a millstone. What's missing
is any serious Color Management in Wayland. It's a bigger project
to fix that, but HDR would then be able to slot into a much more usable
framework.
I agree; honestly, HDR might be the biggest consumer of color
management. We can very well use this stack to drive the color management
design. But the only worry is that when we target something too big and
too generic, the actual purpose gets diluted and we get stuck in a long
chain of mail communication. So probably the most important thing for us,
as a team, would be to break this implementation into small measurable
steps which slowly target respective areas of Weston development, keeping
the end goal alive.

- HDR content/buffers are composed in REC2020 colorspace, with bit depth 
10/12/16 BPC.
Some of the popular formats are P010,P012,P016.

While REC2020 based HDR colorspaces are very popular, they aren't the only ones 
out there.
Agree. My sole purpose in proposing this design was to talk to people,
and come up with a small, scalable target use case, which we can slowly
expand and shape into a bigger, more generic framework, over a period
of time and maturity. So the target was to start with one of the broadly
used cases, and try to expand this range into all possible
formats/cases without regressing the existing framework.

- Normal SRGB buffers are composed in SRGB color space following REC709 
specifications.

As far as I'm aware (I haven't been closely monitoring this mailing
list since the last rather long and unsatisfactory discussion about color
management), Wayland works in display dependent colorspace, since there
is no facility for it to know how to convert from anything else to the
display space (i.e. no knowledge of display profiles so it doesn't
know what sRGB is). In all other computer graphic display systems, it's
up to the client to be informed about each display colorspace is, and
to do color conversion to the display space either itself, or by using
operating system libraries. The operating system provides the display
profile information to allow this. As far as I was informed, Wayland
is architected in such a way that this is not possible, since clients
have no knowledge of which display the pixels they send will end up on.
Yes, this is very accurate. We are planning to add a protocol extension
which will allow a client to pass the buffer's colorspace information to
the compositor. The compositor already has the HW's color capabilities (DRM
properties per plane and CRTC), and the monitor's color capabilities (EDID).
So if the compositor gets all of this, with some good policies, it might be
able to make the call on colorspace handling accurately.


Also, purely from the HDR implementation point of view, we do not really
need to know which REC709 variant the buffer actually uses. We just
need to know whether it is a BT2020 buffer or not, so that we can scale
the color gamut up or down for a proper blending scenario. So again,
strictly for the HDR playback scenario, a non-2020 buffer can be treated
as a 709 buffer, and that is almost safe.


But if we want to provide a proper and accurate color management
solution, we definitely need to know more about the buffer's
colorspace than just whether it is 2020 or not.

Also note that while the use of an intermediate compositing space is
essential when combining different colorspace sources together, it's
not desirable in other situations where maximum fidelity and
gamut are desired i.e. Photography. (The double conversions are a
possible accuracy loss, and it makes it very difficult to
achieve good gamut mapping from large gamut sources.)
Very valid point, and that's why I was inclined towards using the display
HW's capabilities, because modern day HW invests well in the
color pipeline to handle cases like HDR. I guess once we have a mature
basic stack working, we can think of adding something like a user's
preference, which could be another entry among our blending decision
making inputs.

- For accurate blending in display engines, we need to make sure of the following:
 - All the buffers are in the same colorspace (Rec 709 or Rec 2020)
 - All the buffers are linear (gamma/EOTF removed)
 - All the buffers are tone mapped in the same zone (HDR or SDR)

Is that something that Wayland should really know about though ?
i.e. shouldn't that be an application issue, where Wayland provides
the necessary mechanisms to achieve correct composition ?
(Or in fact is that what you are suggesting ?)
Not really; I was in favor of the compositor owning this, as the
compositor is the only element which could

Re: HDR support in Wayland/Weston

2019-01-15 Thread Sharma, Shashank

Hello Pekka,

Thanks a lot for your comments, and inputs on design, stability and 
general accessibility across all platforms.

Please find my comments, inline.

Regards
Shashank
On 1/14/2019 6:51 PM, Pekka Paalanen wrote:

On Thu, 10 Jan 2019 20:32:18 +0530
"Sharma, Shashank"  wrote:


Hello All,

This mail is to propose a design for enabling HDR support in
Wayland/Weston stack, using display engine capabilities, and get more
feedback and input from community.
Here are few points (you might already know these), about HDR
framebuffers, videos and displays:
- HDR content/buffers are composed in REC2020 colorspace, with bit depth
10/12/16 BPC. Some of the popular formats are P010, P012, P016.
- HDR content comes with its own metadata to be applied to get the
right luminance at the display device.
  - The metadata can be of two types: 1. static, 2. dynamic. For
simplicity, this solution is focusing on static HDR only (HDR10 standard)
- HDR content also provides its supported EOTF (electro optical transfer
function) information, which is a curve (like the SRGB gamma curve). One
popular EOTF is PQ (ST2084).
- HDR capable displays mention their EOTF and HDR metadata support
information in EDID CEA-861-G blocks.
- Normal SRGB buffers are composed in SRGB color space following REC709
specifications.
- For accurate blending in display engines, we need to make sure of the following:
  - All the buffers are in the same colorspace (Rec 709 or Rec 2020)
  - All the buffers are linear (gamma/EOTF removed)
  - All the buffers are tone mapped in the same zone (HDR or SDR)

Please refer to the block diagram below, which presents a simple case of
a HDR P010 movie playback, with HDR buffers as video buffers, and SDR
buffers as subtitles. The subsystem looks and works like this:
- A client decodes the buffer (using FFMpeg for example) and gets the
two buffers, one with video (HDR) and one subtitles (SDR)
- Client passes following information to the compositor:
   - The actual buffers
   - Their colorspace information, BT2020 for the HDR buffer, REC709 for
the SDR buffer (planning to add a new protocol extension for this)
   - The HDR metadata of the content (planning to add new protocol
for this)

- Compositor's actions:
 - Reads the end display's HDR capabilities from the display EDID. Assume
it's an HDR HDMI monitor.
 - Compositor tone maps every view's framebuffer to match tone of end
display, applying a libVA filter. In this example:
  - The SDR subtitles frame will go through SDR to HDR tone
mapping (called S2H)
  - The HDR video frame will go through HDR to HDR tone mapping
(called H2H) if the HDR capabilities of monitor and content are different.
  - Now both the buffers and the monitor are in the same tone
mapped range.
  - As the end display is HDR capable, and one of the content frame
is HDR, the compositor will prepare all other planes for color space
conversion (CSC) from REC709->REC2020 using plane CSC property.
  - As the CSC and blending should be done in linear space, the compositor
will also use plane level degamma to make the buffers linear.
  - These actions will make sure that, during blending:
  - All the buffers are in same colorspace (REC2020)
  - All the buffers are linear
  - All the buffers are tone mapped (HDR)
  - The plane level color properties patch, for DRM can be found
here: https://patchwork.freedesktop.org/series/30875/
  - Now, in order to re-apply the HDR curve, compositor will apply
CRTC level gamma, so that the output buffer is non-linear again.
  - To pass the output HDR information to kernel, so that it can
create and send AVI-info-frames to HDMI, compositor will set Connector
HDR metadata property.
  - Code for the same can be found here:
https://patchwork.freedesktop.org/series/25091/
  - And they will ever live happily after :).

Please provide inputs, feedback and suggestions for this design and
plan, so that we can improve our half-cooked solution, and start sending
the patches.
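The "Connector HDR metadata" step in the quoted proposal corresponds to the kernel's HDR_OUTPUT_METADATA connector property. A hedged sketch of filling the static (HDR10) metadata follows; the struct below mirrors the DRM UAPI layout but is redeclared here so the example is self-contained, and a real compositor would include the kernel/libdrm headers and upload this as a property blob (drmModeCreatePropertyBlob) through the atomic API:

```c
#include <stdint.h>

/* Local mirror of the CTA-861-G static metadata block carried by the
 * DRM HDR_OUTPUT_METADATA connector property (see the kernel's
 * drm_mode.h); redeclared here only to keep the sketch self-contained. */
struct hdr_infoframe_sketch {
    uint8_t  eotf;            /* CTA-861-G EOTF: 2 = SMPTE ST 2084 (PQ) */
    uint8_t  metadata_type;   /* 0 = Static Metadata Type 1 */
    struct { uint16_t x, y; } display_primaries[3];
    struct { uint16_t x, y; } white_point;
    uint16_t max_display_mastering_luminance; /* cd/m^2 */
    uint16_t min_display_mastering_luminance; /* 0.0001 cd/m^2 units */
    uint16_t max_cll;         /* max content light level, cd/m^2 */
    uint16_t max_fall;        /* max frame-average light level, cd/m^2 */
};

/* Fill the fields a basic HDR10 playback path needs. */
static void fill_hdr10_infoframe(struct hdr_infoframe_sketch *m,
                                 uint16_t max_cll, uint16_t max_fall)
{
    m->eotf = 2;              /* PQ, matching the content's EOTF */
    m->metadata_type = 0;     /* static HDR10 metadata */
    m->max_cll = max_cll;
    m->max_fall = max_fall;
}
```

The kernel then uses this blob to build the AVI/DRM infoframes sent to the HDMI sink, as the proposal describes.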

Hi Shashank,

this is a major feature that would be awesome to have in Weston, but
it's also a big effort to design, implement, and maintain. To ease the
maintenance, we will need some serious work on the test suite, which
currently cannot even run the GL-renderer.
I agree, Weston's stability and maintenance must be a high priority.
That's why I was thinking that instead of enabling such a huge feature
in a single shot, trying to cover all possible cases and
combinations, we can split it into small modular subsets of features, in
the form of small patch series, review them well, and add them slowly to
mainstream with proper precaution.


These subsets could be ( just example)
- enabling single HDR view (like a movie playback), in fullscreen mode, 
targeting only one target colorspace (say BT2020) and one type of HDR 
metadata (say Static HDR metadata, PQ ST 2084 EOTF, HDR 10 type)
If we find this subset st

Re: HDR support in Wayland/Weston

2019-01-15 Thread Adam Jackson
On Tue, 2019-01-15 at 11:30 +0200, Pekka Paalanen wrote:
> On Tue, 15 Jan 2019 13:47:07 +1100
> Graeme Gill  wrote:
> 
> > If done in the composer, it would need to render the graphic elements to
> > the output DPI / convert the source colorspace to the output colorspace.
> > But the composer would need the code to do rendering / convert colorspaces
> > (as well as being told what the graphic elements / source colorspace is),
> > and this is not the role Wayland has - that's the responsibility of the
> > client, so instead Wayland makes it possible for the client to know what DPI
> > it is rendering to. The analogous facility for CM is for the client to know
> > what output colorspace it is rendering for.
> 
> Yes and no. Yes, we do and should let clients know what kind of outputs
> their contents will be shown on. However, we will in any case need the
> compositor to be able to do the full and correct conversion from what
> ever the client has submitted to what is correct for a specific output,
> because nothing guarantees that those two would always match.

This isn't necessarily true. The server is free to just draw a black
rectangle (or nothing) instead if the image doesn't match the target
colorspace. If you want to handle the case of cloned outputs or
crossing output borders, let the client attach one image per output
colorspace if it wants, and let the server send the client events to
indicate when it should start or stop drawing to a particular
colorspace. You need that complexity in the client's renderer anyway,
why add it to the server's?

> One wl_surface on multiple outputs is an obvious case where one buffer
> rendered by a client cannot match all the outputs it is shown on. The
> other case is transitions between outputs, where we cannot have the
> compositor wait for the client to re-draw with new color parameters.

Honestly I think of this as an implementation issue? If we take the
above multiple-images approach, then if it's my compositor I just omit
drawing actors onto any output where there isn't an image for that
colorspace, because I am comfortable saying any further latency is the
client's renderer's problem. Someone else's compositor might try to
bend the existing image to the output colorspace until the client has
caught up, believing close-but-wrong color is better than visible
absence of color. I could see an argument for either implementation
depending on the environment, and I don't really see why the protocol
spec should require one or the other.

- ajax


Re: HDR support in Wayland/Weston

2019-01-15 Thread Niels Ole Salscheider
Am Dienstag, 15. Januar 2019, 10:30:14 CET schrieb Pekka Paalanen:
> On Tue, 15 Jan 2019 13:47:07 +1100
> 
> Graeme Gill  wrote:
> > Pekka Paalanen wrote:
> > 
> > Hi Pekka,
> > 
> > thanks for your response.
> > 
> > >> As far as I was informed, Wayland
> > >> is architected in such a way that this is not possible, since clients
> > >> have no knowledge of which display the pixels they send will end up on.
> > > 
> > > Nothing has changed there.
> > 
> > I've been pondering the various Color Management (CM) approaches to
> > working around this limitation, but I keep coming back to it
> > as the most fruitful direction to talk about. The main reasons
> > are that this implies the least extra burden on Wayland implementations,
> > and is most consonant with current application and GUI toolkit
> > CM code.
> > 
> > And in fact Wayland has already been changed in this direction already, to
> > accommodate a highly analogous requirement to that of Color Management :-
> > HiDPI. So as best I understand it, because (unlike X11) Wayland does not
> > do rendering, it is not reasonable for it to suddenly re-render
> > application graphics at hi resolution - the best that it can do is scale
> > the pixels, leading to poorer visual quality than is possible on HiDPI
> > displays. So HiDPI aware applications have to know when they are
> > rendering for a HiDPI output, and scale their buffers and rendering
> > accordingly, and tell Wayland that they have done so using
> > wl_surface.set_buffer_scale. [ I am not currently clear on how the
> > situation of a user window straddling two displays of differing DPI is
> > handled. ]
> 
> Hi Graeme,
> 
> your understanding of how HiDPI works on Wayland is correct.
> 
> If a wl_surface straddles multiple outputs simultaneously, then
> wl_surface.enter/leave events indicate the surface being on all those
> outputs at the same time. The client is expected to take all the
> entered outputs into consideration when it chooses how to render its
> image. For HiDPI, this usually means taking the maximum output scale
> from that set of outputs. The compositor will then automatically
> (down)scale for the other outputs accordingly.

> This scheme also means that the compositor does not necessarily need to
> wait for a client to render when the outputs suddenly change. It knows
> how to transform the existing image for new outputs already. The visual
> quality may jump afterwards when the client catches up, but there is no
> window blinking in and out of existence.
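The rule described above, rendering at the maximum scale among entered outputs and letting the compositor downscale for the rest, reduces to a tiny piece of client-side logic. A sketch, assuming the client has already paired wl_surface.enter/leave events with each output's wl_output.scale:

```c
/* Pick the buffer scale for a surface from the scale factors of all
 * outputs the surface has currently entered. Falls back to 1 when the
 * surface has not entered any output yet. */
static int choose_buffer_scale(const int *entered_scales, int n)
{
    int scale = 1;
    for (int i = 0; i < n; i++)
        if (entered_scales[i] > scale)
            scale = entered_scales[i];
    return scale;
}
```

The client then renders at that scale and passes the same value to wl_surface.set_buffer_scale; the compositor downscales automatically on the lower-DPI outputs.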
> 
> > The CM situation is highly analogous - like the DPI, the colorspace
> > (profile) of each output may be different for different displays, so for
> > highest quality output, something in the graphics chain needs to
> > accommodate it. To do so the relevant information is needed :- what
> > output is being rendered to, and what its characteristic is (DPI / Color
> > Profile).
> Yes, indeed.
> 
> > If done in the composer, it would need to render the graphic elements to
> > the output DPI / convert the source colorspace to the output colorspace.
> > But the composer would need the code to do rendering / convert colorspaces
> > (as well as being told what the graphic elements / source colorspace is),
> > and this is not the role Wayland has - that's the responsibility of the
> > client, so instead Wayland makes it possible for the client to know what
> > DPI it is rendering to. The analogous facility for CM is for the client
> > to know what output colorspace it is rendering for.
> 
> Yes and no. Yes, we do and should let clients know what kind of outputs
> their contents will be shown on. However, we will in any case need the
> compositor to be able to do the full and correct conversion from what
> ever the client has submitted to what is correct for a specific output,
> because nothing guarantees that those two would always match.
> 
> One wl_surface on multiple outputs is an obvious case where one buffer
> rendered by a client cannot match all the outputs it is shown on. The
> other case is transitions between outputs, where we cannot have the
> compositor wait for the client to re-draw with new color parameters.

I think the last proposal of a color management protocol that we discussed 
does that. It contains the device link profiles and it also allows the client 
to query the profile of wl_outputs. With that, an application can display 
accurate colors in nearly every situation, even on multiple screens 
simultaneously. But the compositor can still do its best to provide good 
output in some corner cases (e.g. when a new screen is activated and the 
application has not rendered a new frame yet). Once the application reacts to 
that change the output will be perfect again.

> > The clean/simplest approach to HDR is to treat it as just another output
> > colorspace, where it is up to the application to render the color
> > it intends to display, and the Wayland compositor is compositing
> > everything
> > 

Re: HDR support in Wayland/Weston

2019-01-15 Thread Pekka Paalanen
On Tue, 15 Jan 2019 13:47:07 +1100
Graeme Gill  wrote:

> Pekka Paalanen wrote:
> 
> Hi Pekka,
>   thanks for your response.
> 
> >> As far as I was informed, Wayland
> >> is architected in such a way that this is not possible, since clients
> >> have no knowledge of which display the pixels they send will end up on.  
> > 
> > Nothing has changed there.  
> 
> I've been pondering the various Color Management (CM) approaches to
> working around this limitation, but I keep coming back to it
> as the most fruitful direction to talk about. The main reasons
> are that this implies the least extra burden on Wayland implementations,
> and is most consonant with current application and GUI toolkit
> CM code.
> 
> And in fact Wayland has already been changed in this direction, to
> accommodate a requirement highly analogous to that of Color Management:
> HiDPI.
> So as best I understand it, because (unlike X11) Wayland does not do
> rendering, it is not reasonable for it to suddenly re-render application
> graphics at high resolution - the best that it can do is scale the pixels,
> leading to poorer visual quality than is possible on HiDPI displays.
> So HiDPI aware applications have to know when they are rendering for a HiDPI
> output, and scale their buffers and rendering accordingly, and tell Wayland 
> that
> they have done so using wl_surface.set_buffer_scale. [ I am not currently
> clear on how the situation of a user window straddling two displays of
> differing DPI is handled. ]

Hi Graeme,

your understanding of how HiDPI works on Wayland is correct.

If a wl_surface straddles multiple outputs simultaneously, then
wl_surface.enter/leave events indicate the surface being on all those
outputs at the same time. The client is expected to take all the
entered outputs into consideration when it chooses how to render its
image. For HiDPI, this usually means taking the maximum output scale
from that set of outputs. The compositor will then automatically
(down)scale for the other outputs accordingly.

This scheme also means that the compositor does not necessarily need to
wait for a client to render when the outputs suddenly change. It knows
how to transform the existing image for new outputs already. The visual
quality may jump afterwards when the client catches up, but there is no
window blinking in and out of existence.
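The "take the maximum scale of the entered outputs" policy described above can be sketched as a small helper. The struct and function names here are hypothetical; a real client would maintain this list from wl_surface.enter/leave events and pass the result to wl_surface.set_buffer_scale.

```c
#include <stdint.h>

/* Hypothetical bookkeeping for a toplevel surface: the integer scale
 * factors (wl_output.scale) of every output the surface has currently
 * entered, updated on wl_surface.enter/leave events. */
struct surface_outputs {
    int32_t scales[16];
    int num_entered;
};

/* Pick the buffer scale to pass to wl_surface.set_buffer_scale: the
 * maximum scale among entered outputs, so the buffer is sharp on the
 * densest display and the compositor downscales for the others. */
static int32_t choose_buffer_scale(const struct surface_outputs *so)
{
    int32_t best = 1; /* default when no outputs have been entered yet */
    for (int i = 0; i < so->num_entered; i++)
        if (so->scales[i] > best)
            best = so->scales[i];
    return best;
}
```

A client straddling a scale-1 and a scale-2 output would thus render at scale 2 and let the compositor downscale on the scale-1 output.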

> The CM situation is highly analogous - like the DPI, the colorspace (profile)
> of each output may be different for different displays, so for highest
> quality output, something in the graphics chain needs to accommodate it.
> To do so the relevant information is needed :- what output is being
> rendered to, and what its characteristic is (DPI / Color Profile).

Yes, indeed.

> If done in the compositor, it would need to render the graphic elements to
> the output DPI / convert the source colorspace to the output colorspace.
> But the compositor would need the code to do rendering / convert colorspaces
> (as well as being told what the graphic elements / source colorspace is),
> and this is not the role Wayland has - that's the responsibility of the
> client, so instead Wayland makes it possible for the client to know what DPI
> it is rendering to. The analogous facility for CM is for the client to know
> what output colorspace it is rendering for.

Yes and no. Yes, we do and should let clients know what kind of outputs
their contents will be shown on. However, we will in any case need the
compositor to be able to do the full and correct conversion from whatever
the client has submitted to what is correct for a specific output,
because nothing guarantees that those two would always match.

One wl_surface on multiple outputs is an obvious case where one buffer
rendered by a client cannot match all the outputs it is shown on. The
other case is transitions between outputs, where we cannot have the
compositor wait for the client to re-draw with new color parameters.

> The clean/simplest approach to HDR is to treat it as just another output
> colorspace, where it is up to the application to render the color
> it intends to display, and the Wayland compositor is compositing everything
> for that output in that HDR colorspace.

Agreed.

> Now practically speaking this would assume that all Wayland clients connected
> to an HDR display in HDR mode are CM/HDR aware, which is rather unlikely. So
> some backward compatibility modes might be highly desirable (I have
> some thoughts on that), but in any case, it would also help the quality
> of such backward compatibility _and_ compositing (i.e. linear light
> compositing option), if Wayland at least had access to the output color
> profiles. So there is a lot of advantage in Wayland providing the
> registry/API of output color profiles both for itself, and clients.

That backward compatibility / fallback is an integral part of the image
transformations the compositor must be able to do in any case. So yes,
I agree.

For instance, the whole 

Re: HDR support in Wayland/Weston

2019-01-14 Thread Graeme Gill
Pekka Paalanen wrote:

Hi Pekka,
thanks for your response.

>> As far as I was informed, Wayland
>> is architected in such a way that this is not possible, since clients
>> have no knowledge of which display the pixels they send will end up on.
> 
> Nothing has changed there.

I've been pondering the various Color Management (CM) approaches to
working around this limitation, but I keep coming back to it
as the most fruitful direction to talk about. The main reasons
are that this implies the least extra burden on Wayland implementations,
and is most consonant with current application and GUI toolkit
CM code.

And in fact Wayland has already been changed in this direction, to
accommodate a requirement highly analogous to that of Color Management: HiDPI.
So as best I understand it, because (unlike X11) Wayland does not do
rendering, it is not reasonable for it to suddenly re-render application
graphics at high resolution - the best that it can do is scale the pixels,
leading to poorer visual quality than is possible on HiDPI displays.
So HiDPI aware applications have to know when they are rendering for a HiDPI
output, and scale their buffers and rendering accordingly, and tell Wayland that
they have done so using wl_surface.set_buffer_scale. [ I am not currently
clear on how the situation of a user window straddling two displays of
differing DPI is handled. ]

The CM situation is highly analogous - like the DPI, the colorspace (profile)
of each output may be different for different displays, so for highest
quality output, something in the graphics chain needs to accommodate it.
To do so the relevant information is needed :- what output is being
rendered to, and what its characteristic is (DPI / Color Profile).

If done in the compositor, it would need to render the graphic elements to
the output DPI / convert the source colorspace to the output colorspace.
But the compositor would need the code to do rendering / convert colorspaces
(as well as being told what the graphic elements / source colorspace is),
and this is not the role Wayland has - that's the responsibility of the
client, so instead Wayland makes it possible for the client to know what DPI
it is rendering to. The analogous facility for CM is for the client to know
what output colorspace it is rendering for.

The clean/simplest approach to HDR is to treat it as just another output
colorspace, where it is up to the application to render the color
it intends to display, and the Wayland compositor is compositing everything
for that output in that HDR colorspace.

Now practically speaking this would assume that all Wayland clients connected
to an HDR display in HDR mode are CM/HDR aware, which is rather unlikely. So
some backward compatibility modes might be highly desirable (I have
some thoughts on that), but in any case, it would also help the quality
of such backward compatibility _and_ compositing (i.e. linear light
compositing option), if Wayland at least had access to the output color
profiles. So there is a lot of advantage in Wayland providing the
registry/API of output color profiles both for itself, and clients.

> Wayland and apps need to provide the compositor all the necessary
> information for the compositor to do all the conversions, mapping and
> blending correctly, if it has to.

And perhaps it shouldn't have to.

> This is because an application will provide only one image for a
> wl_surface, and the compositor may show that on any number of any kind
> of outputs at once.

That's a big problem. That assumes either that all displays are interchangeable
at the pixel level (they are not - they have different DPI and
colorspace/gamut/HDR capability), or that Wayland has to know how to re-render
to accommodate those differences (and Wayland doesn't do rendering, and may not
want to include colorspace conversion or HDR conversion machinery).

> Nothing prevents adding more protocol to give apps more hints to behave
> more optimally with respect to the compositor's internal pipeline.

There's a difference between hints that help speed and quality but
still burden the compositor with doing display-dependent conversions,
and an approach that moves that burden entirely to the client when
it does the rendering.

> Correct, and Shashank is not proposing anything different. The
> per-channel lookup curves are a compositor internal detail, always
> programmed by the compositor correctly according to what it happens to
> be showing each monitor refresh on that specific monitor.

Since the compositor is not Color Management aware, by definition
it can't set the CRTC to the correct values ("correct" in terms
of the color-sensitive end user's intentions for how they need their systems
and applications to work). Now if the compositor was CM aware, it could
choose whether to implement CM display calibration curves by using
the hardware CRTC, or by implementing it in some other fashion such
as with a shader, or (ideally) 

Re: HDR support in Wayland/Weston

2019-01-14 Thread Pekka Paalanen
On Fri, 11 Jan 2019 18:25:01 +1100
Graeme Gill  wrote:

> Sharma, Shashank wrote:
> 
> Hi,
> 
> While I'm sure you could hard code various color space assumptions into
> such an implementation (and perhaps this is a pretty reasonable way
> of doing a proof of concept), it's not a good long term solution,
> and could end up being something of a millstone. What's missing
> is any serious Color Management in Wayland. It's a bigger project
> to fix that, but HDR would then be able to slot into a much more usable
> framework.

Hi,

that I agree with. At least the final design should aim to integrate
with color management related public application protocol extensions.

In other words, I think it would be ok to use place-holder extensions
if the color management extensions are not there yet, but plan for
swapping to the proper extension later when you aim for the larger
public.

In practice, that would mean more than one Wayland global interface for
HDR purposes, so that those that deal with color spaces can later be
dropped in favour of the color extensions instead of being stabilized.

> > - HDR content/buffers are composed in REC2020 colorspace, with bit depth 
> > 10/12/16 BPC.
> > Some of the popular formats are P010,P012,P016.  
> 
> While REC2020 based HDR colorspaces are very popular, they aren't the only 
> ones out there.
> 
> > - Normal SRGB buffers are composed in SRGB color space following REC709 
> > specifications.  
> 
> As far as I'm aware (I haven't been closely monitoring this mailing
> list since the last rather long and unsatisfactory discussion about color
> management), Wayland works in a display-dependent colorspace, since there
> is no facility for it to know how to convert from anything else to the
> display space (i.e. no knowledge of display profiles, so it doesn't
> know what sRGB is). In all other computer graphics display systems, it's
> up to the client to be informed of what each display's colorspace is, and
> to do color conversion to the display space either itself, or by using
> operating system libraries. The operating system provides the display
> profile information to allow this. As far as I was informed, Wayland
> is architected in such a way that this is not possible, since clients
> have no knowledge of which display the pixels they send will end up on.

Nothing has changed there.

> Also note that while the use of an intermediate compositing space is
> essential when combining different colorspace sources together, it's
> not desirable in other situations where maximum fidelity and
> gamut are desired i.e. Photography. (The double conversions are a
> possible accuracy loss, and it makes it very difficult to
> achieve good gamut mapping from large gamut sources.)

Right, ideally a series of conversions that should amount to identity
would be completely skipped on a frame by frame basis. Thankfully that
will be a compositor internal implementation detail - there should be
no protocol that demands unnecessary work.
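One compositor-internal fast path for that is simply comparing the source and destination color-space descriptors before building any conversion chain. A minimal sketch, with a made-up descriptor struct (real Weston code would compare ICC profiles or protocol enum values):

```c
#include <stdbool.h>

/* Hypothetical color-space descriptor a compositor might attach to
 * both client buffers and outputs. */
struct colorspace {
    int primaries;   /* e.g. 709 or 2020 */
    int eotf;        /* e.g. sRGB, PQ */
    bool full_range;
};

/* Return true when a conversion pass is actually needed. When source
 * and destination match, the compositor can scan out or blit directly
 * and skip the (identity) conversion entirely, frame by frame. */
static bool needs_conversion(const struct colorspace *src,
                             const struct colorspace *dst)
{
    return src->primaries != dst->primaries ||
           src->eotf != dst->eotf ||
           src->full_range != dst->full_range;
}
```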

> > - For accurate blending in display engines, we need to make sure following:
> >     - All the buffers are in same colorspace (Rec 709 or Rec 2020)
> >     - All the buffers are linear (gamma/EOTF removed)
> >     - All the buffers are tone mapped in same zone (HDR or SDR)  
> 
> Is that something that Wayland should really know about though ?
> i.e. shouldn't that be an application issue, where Wayland provides
> the necessary mechanisms to achieve correct composition ?
> (Or in fact is that what you are suggesting ?)

Wayland and apps need to provide the compositor all the necessary
information for the compositor to do all the conversions, mapping and
blending correctly, if it has to.

This is because an application will provide only one image for a
wl_surface, and the compositor may show that on any number of any kind
of outputs at once. The alternative would be for apps to provide one
image per each output for one wl_surface, so that the compositor can
pick the correctly rendered one - except that would still not allow the
compositor to blend correctly, it would only work for opaque windows.
It would also prohibit the use of display hardware to off-load any part
of the image processing.

But, just like HiDPI, it is not only the apps telling the compositor
what kind of image they are providing (wl_surface.set_buffer_scale
request). The compositor describes the outputs (wl_output.scale event)
so that applications can choose the best way to render, and maybe save
some work in the compositor at the same time.

Nothing prevents adding more protocol to give apps more hints to behave
more optimally with respect to the compositor's internal pipeline.

> >     - Now, in order to re-apply the HDR curve, compositor will apply CRTC 
> > level gamma, so
> > that the output buffer is non-linear again.  
> 
> Note that in most Color Managed systems, the CRTC per channel lookup curves 
> are used for
> the purposes of display calibration, although Wayland doesn't currently 
> support
> Color 

Re: HDR support in Wayland/Weston

2019-01-14 Thread Pekka Paalanen
On Thu, 10 Jan 2019 20:32:18 +0530
"Sharma, Shashank"  wrote:

> Hello All,
> 
> This mail is to propose a design for enabling HDR support in the
> Wayland/Weston stack, using display engine capabilities, and to get more
> feedback and input from the community.
> Here are a few points (you might already know these) about HDR
> framebuffers, videos and displays:
> - HDR content/buffers are composed in the REC2020 colorspace, with bit depth
> 10/12/16 BPC. Some of the popular formats are P010, P012 and P016.
> - HDR content comes with its own metadata, to be applied to get the
> right luminance at the display device.
>  - The metadata can be of two types: 1. static, 2. dynamic. For
> simplicity, this solution focuses on static HDR only (the HDR10 standard)
> - HDR content also provides its supported EOTF (electro-optical transfer
> function) information, which is a curve (like the SRGB gamma curve). One
> popular EOTF is PQ (ST 2084).
> - HDR-capable displays advertise their EOTF and HDR metadata support
> in EDID CEA-861-G blocks.
> - Normal SRGB buffers are composed in the SRGB color space following REC709
> specifications.
> - For accurate blending in display engines, we need to ensure the following:
>  - All the buffers are in the same colorspace (Rec 709 or Rec 2020)
>  - All the buffers are linear (gamma/EOTF removed)
>  - All the buffers are tone mapped in the same zone (HDR or SDR)
> 
> Please refer to the block diagram below, which presents a simple case of
> an HDR P010 movie playback, with HDR buffers as video buffers, and SDR
> buffers as subtitles. The subsystem looks and works like this:
> - A client decodes the stream (using FFmpeg, for example) and gets
> two buffers, one with video (HDR) and one with subtitles (SDR)
> - The client passes the following information to the compositor:
>   - The actual buffers
>   - Their colorspace information: BT2020 for the HDR buffer, REC709 for the
> SDR buffer (planning to add a new protocol extension for this)
>   - The HDR metadata of the content (planning to add a new protocol
> for this)
> 
> - Compositor's actions:
> - Reads the end display's HDR capabilities from the display EDID. Assume
> it's an HDR HDMI monitor.
> - The compositor tone maps every view's framebuffer to match the tone of the
> end display, applying a libVA filter. In this example:
>  - The SDR subtitles frame will go through SDR-to-HDR tone
> mapping (called S2H)
>  - The HDR video frame will go through HDR-to-HDR tone mapping
> (called H2H) if the HDR capabilities of the monitor and content differ.
>  - Now both the buffers and the monitor are in the same tone
> mapped range.
>  - As the end display is HDR capable, and one of the content frames
> is HDR, the compositor will prepare all other planes for color space
> conversion (CSC) from REC709->REC2020 using the plane CSC property.
>  - As the CSC and blending should be done in linear space, the compositor
> will also use plane-level degamma to make the buffers linear.
>  - These actions will make sure that, during blending:
>  - All the buffers are in the same colorspace (REC2020)
>  - All the buffers are linear
>  - All the buffers are tone mapped (HDR)
>  - The plane-level color properties patch for DRM can be found
> here: https://patchwork.freedesktop.org/series/30875/
>  - Now, in order to re-apply the HDR curve, the compositor will apply
> CRTC-level gamma, so that the output buffer is non-linear again.
>  - To pass the output HDR information to the kernel, so that it can
> create and send AVI info-frames to HDMI, the compositor will set the Connector
> HDR metadata property.
>  - Code for the same can be found here:
> https://patchwork.freedesktop.org/series/25091/
>  - And they will live happily ever after :).
> 
> Please provide input, feedback and suggestions on this design and
> plan, so that we can improve our half-cooked solution and start sending
> the patches.

Hi Shashank,

this is a major feature that would be awesome to have in Weston, but
it's also a big effort to design, implement, and maintain. To ease the
maintenance, we will need some serious work on the test suite, which
currently cannot even run the GL-renderer.

I understand your aim is to leverage display hardware capabilities to
the fullest, but we must also consider hardware that lacks some or all
of the conversion/mapping/other features while the monitor itself is fully
HDR-capable.
not HDR-capable or is somehow lacking. OTOH, whether a compositor
implements HDR support at all would be obvious in the advertised
Wayland globals and pixel formats.

Do we want to support HDR output in some way even if the display engine
(in 

Re: HDR support in Wayland/Weston

2019-01-10 Thread Graeme Gill
Sharma, Shashank wrote:

Hi,

While I'm sure you could hard code various color space assumptions into
such an implementation (and perhaps this is a pretty reasonable way
of doing a proof of concept), it's not a good long term solution,
and could end up being something of a millstone. What's missing
is any serious Color Management in Wayland. It's a bigger project
to fix that, but HDR would then be able to slot into a much more usable
framework.

> - HDR content/buffers are composed in REC2020 colorspace, with bit depth 
> 10/12/16 BPC.
> Some of the popular formats are P010,P012,P016.

While REC2020 based HDR colorspaces are very popular, they aren't the only ones 
out there.

> - Normal SRGB buffers are composed in SRGB color space following REC709 
> specifications.

As far as I'm aware (I haven't been closely monitoring this mailing
list since the last rather long and unsatisfactory discussion about color
management), Wayland works in a display-dependent colorspace, since there
is no facility for it to know how to convert from anything else to the
display space (i.e. no knowledge of display profiles, so it doesn't
know what sRGB is). In all other computer graphics display systems, it's
up to the client to be informed of what each display's colorspace is, and
to do color conversion to the display space either itself, or by using
operating system libraries. The operating system provides the display
profile information to allow this. As far as I was informed, Wayland
is architected in such a way that this is not possible, since clients
have no knowledge of which display the pixels they send will end up on.

Also note that while the use of an intermediate compositing space is
essential when combining different colorspace sources together, it's
not desirable in other situations where maximum fidelity and
gamut are desired i.e. Photography. (The double conversions are a
possible accuracy loss, and it makes it very difficult to
achieve good gamut mapping from large gamut sources.)

> - For accurate blending in display engines, we need to make sure following:
>     - All the buffers are in same colorspace (Rec 709 or Rec 2020)
>     - All the buffers are linear (gamma/EOTF removed)
>     - All the buffers are tone mapped in same zone (HDR or SDR)

Is that something that Wayland should really know about though ?
i.e. shouldn't that be an application issue, where Wayland provides
the necessary mechanisms to achieve correct composition ?
(Or in fact is that what you are suggesting ?)

>     - Now, in order to re-apply the HDR curve, compositor will apply CRTC 
> level gamma, so
> that the output buffer is non-linear again.

Note that in most Color Managed systems, the CRTC per-channel lookup curves are
used for display calibration, although Wayland doesn't currently support
Color Management tools at all (unlike X11 or other operating systems, it
provides no access to the CRTC by client software).
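For readers unfamiliar with the mechanism: a CRTC "gamma" ramp is just a per-channel 1D lookup table that the scanout hardware applies to every pixel, and a calibration loader fills it from a profile's calibration curve. A minimal software model of such a ramp (not the DRM API itself, which exposes 16-bit GAMMA_LUT properties; 8-bit here for brevity):

```c
#include <stdint.h>

#define LUT_SIZE 256

/* One per-channel calibration curve, as a calibration loader would
 * program into a CRTC gamma ramp. */
struct channel_lut {
    uint8_t table[LUT_SIZE];
};

/* Identity curve: output equals input, i.e. calibration disabled.
 * Loading this is the "known state" a profile loader wants to be able
 * to establish. */
static void lut_identity(struct channel_lut *lut)
{
    for (int i = 0; i < LUT_SIZE; i++)
        lut->table[i] = (uint8_t)i;
}

/* Apply the curve to one 8-bit channel value, as the display hardware
 * would on scanout. */
static uint8_t lut_apply(const struct channel_lut *lut, uint8_t v)
{
    return lut->table[v];
}
```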

Perhaps it's worth re-visiting some of the ideas about how to add Color 
Management
to Wayland, to see how HDR could fit into it ?

regards,
Graeme Gill
(Author of ArgyllCMS)
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: HDR support in Wayland/Weston

2019-01-10 Thread Sharma, Shashank

Hello Ole,


On 1/10/2019 9:31 PM, Niels Ole Salscheider wrote:

Hello,

At first glance this sounds sensible. Would it work well with the last color
management protocol proposal that I made or do you see issues there?
We could add REC2020 as another predefined profile.

https://lists.freedesktop.org/archives/wayland-devel/2017-January/032769.html
I think the last proposal was mostly sane and usable for everybody, but there
was not much interest afterwards. However, there was a lot of discussion with
wishes from different sides that went into this. The relevant mailing list
threads are the following, but you have to follow the discussion over the next
months:

https://lists.freedesktop.org/archives/wayland-devel/2016-November/031728.html
https://lists.freedesktop.org/archives/wayland-devel/2014-March/013951.html
Thanks for sharing the links; let me go through the discussion history.
I will come back with my feedback on this stack soon.

- Shashank

Best regards,
Ole

Am Donnerstag, 10. Januar 2019, 16:02:18 CET schrieb Sharma, Shashank:

Hello All,

This mail is to propose a design for enabling HDR support in the
Wayland/Weston stack, using display engine capabilities, and to get more
feedback and input from the community.
Here are a few points (you might already know these) about HDR
framebuffers, videos and displays:
- HDR content/buffers are composed in the REC2020 colorspace, with bit depth
10/12/16 BPC. Some of the popular formats are P010, P012 and P016.
- HDR content comes with its own metadata, to be applied to get the
right luminance at the display device.
  - The metadata can be of two types: 1. static, 2. dynamic. For
simplicity, this solution focuses on static HDR only (the HDR10 standard)
- HDR content also provides its supported EOTF (electro-optical transfer
function) information, which is a curve (like the SRGB gamma curve). One
popular EOTF is PQ (ST 2084).
- HDR-capable displays advertise their EOTF and HDR metadata support
in EDID CEA-861-G blocks.
- Normal SRGB buffers are composed in the SRGB color space following REC709
specifications.
- For accurate blending in display engines, we need to ensure the following:
  - All the buffers are in the same colorspace (Rec 709 or Rec 2020)
  - All the buffers are linear (gamma/EOTF removed)
  - All the buffers are tone mapped in the same zone (HDR or SDR)

Please refer to the block diagram below, which presents a simple case of
an HDR P010 movie playback, with HDR buffers as video buffers, and SDR
buffers as subtitles. The subsystem looks and works like this:
- A client decodes the stream (using FFmpeg, for example) and gets
two buffers, one with video (HDR) and one with subtitles (SDR)
- The client passes the following information to the compositor:
   - The actual buffers
   - Their colorspace information: BT2020 for the HDR buffer, REC709 for the
SDR buffer (planning to add a new protocol extension for this)
   - The HDR metadata of the content (planning to add a new protocol
for this)

- Compositor's actions:
 - Reads the end display's HDR capabilities from the display EDID. Assume
it's an HDR HDMI monitor.
 - The compositor tone maps every view's framebuffer to match the tone of the
end display, applying a libVA filter. In this example:
  - The SDR subtitles frame will go through SDR-to-HDR tone
mapping (called S2H)
  - The HDR video frame will go through HDR-to-HDR tone mapping
(called H2H) if the HDR capabilities of the monitor and content differ.
  - Now both the buffers and the monitor are in the same tone
mapped range.
  - As the end display is HDR capable, and one of the content frames
is HDR, the compositor will prepare all other planes for color space
conversion (CSC) from REC709->REC2020 using the plane CSC property.
  - As the CSC and blending should be done in linear space, the compositor
will also use plane-level degamma to make the buffers linear.
  - These actions will make sure that, during blending:
  - All the buffers are in the same colorspace (REC2020)
  - All the buffers are linear
  - All the buffers are tone mapped (HDR)
  - The plane-level color properties patch for DRM can be found
here: https://patchwork.freedesktop.org/series/30875/
  - Now, in order to re-apply the HDR curve, the compositor will apply
CRTC-level gamma, so that the output buffer is non-linear again.
  - To pass the output HDR information to the kernel, so that it can
create and send AVI info-frames to HDMI, the compositor will set the Connector
HDR metadata property.
  - Code for the same can be found here:
https://patchwork.freedesktop.org/series/25091/
  - And they will live happily ever after :).

Please provide input, feedback and suggestions on this design and
plan, so that we can improve our half-cooked solution and start sending
the patches.

[ASCII block diagram of the HDR playback pipeline (SDR subtitle buffer and
HDR video buffer into the compositor), truncated in the archive.]
Re: HDR support in Wayland/Weston

2019-01-10 Thread Niels Ole Salscheider
Hello,

At first glance this sounds sensible. Would it work well with the last color
management protocol proposal that I made or do you see issues there?
We could add REC2020 as another predefined profile.

https://lists.freedesktop.org/archives/wayland-devel/2017-January/032769.html

I think the last proposal was mostly sane and usable for everybody, but there 
was not much interest afterwards. However, there was a lot of discussion with 
wishes from different sides that went into this. The relevant mailing list 
threads are the following, but you have to follow the discussion over the next 
months:

https://lists.freedesktop.org/archives/wayland-devel/2016-November/031728.html
https://lists.freedesktop.org/archives/wayland-devel/2014-March/013951.html

Best regards,
Ole

Am Donnerstag, 10. Januar 2019, 16:02:18 CET schrieb Sharma, Shashank:
> Hello All,
> 
> This mail is to propose a design for enabling HDR support in the
> Wayland/Weston stack, using display engine capabilities, and to get more
> feedback and input from the community.
> Here are a few points (you might already know these) about HDR
> framebuffers, videos and displays:
> - HDR content/buffers are composed in the REC2020 colorspace, with bit depth
> 10/12/16 BPC. Some of the popular formats are P010, P012 and P016.
> - HDR content comes with its own metadata, to be applied to get the
> right luminance at the display device.
>  - The metadata can be of two types: 1. static, 2. dynamic. For
> simplicity, this solution focuses on static HDR only (the HDR10 standard)
> - HDR content also provides its supported EOTF (electro-optical transfer
> function) information, which is a curve (like the SRGB gamma curve). One
> popular EOTF is PQ (ST 2084).
> - HDR-capable displays advertise their EOTF and HDR metadata support
> in EDID CEA-861-G blocks.
> - Normal SRGB buffers are composed in the SRGB color space following REC709
> specifications.
> - For accurate blending in display engines, we need to ensure the following:
>  - All the buffers are in the same colorspace (Rec 709 or Rec 2020)
>  - All the buffers are linear (gamma/EOTF removed)
>  - All the buffers are tone mapped in the same zone (HDR or SDR)
> 
> Please refer to the block diagram below, which presents a simple case of
> an HDR P010 movie playback, with HDR buffers as video buffers, and SDR
> buffers as subtitles. The subsystem looks and works like this:
> - A client decodes the stream (using FFmpeg, for example) and gets
> two buffers, one with video (HDR) and one with subtitles (SDR)
> - The client passes the following information to the compositor:
>   - The actual buffers
>   - Their colorspace information: BT2020 for the HDR buffer, REC709 for the
> SDR buffer (planning to add a new protocol extension for this)
>   - The HDR metadata of the content (planning to add a new protocol
> for this)
> 
> - Compositor's actions:
> - Reads the end display's HDR capabilities from the display EDID. Assume
> it's an HDR HDMI monitor.
> - The compositor tone maps every view's framebuffer to match the tone of the
> end display, applying a libVA filter. In this example:
>  - The SDR subtitles frame will go through SDR-to-HDR tone
> mapping (called S2H)
>  - The HDR video frame will go through HDR-to-HDR tone mapping
> (called H2H) if the HDR capabilities of the monitor and content differ.
>  - Now both the buffers and the monitor are in the same tone
> mapped range.
>  - As the end display is HDR capable, and one of the content frames
> is HDR, the compositor will prepare all other planes for color space
> conversion (CSC) from REC709->REC2020 using the plane CSC property.
>  - As the CSC and blending should be done in linear space, the compositor
> will also use plane-level degamma to make the buffers linear.
>  - These actions will make sure that, during blending:
>  - All the buffers are in the same colorspace (REC2020)
>  - All the buffers are linear
>  - All the buffers are tone mapped (HDR)
>  - The plane-level color properties patch for DRM can be found
> here: https://patchwork.freedesktop.org/series/30875/
>  - Now, in order to re-apply the HDR curve, the compositor will apply
> CRTC-level gamma, so that the output buffer is non-linear again.
>  - To pass the output HDR information to the kernel, so that it can
> create and send AVI info-frames to HDMI, the compositor will set the Connector
> HDR metadata property.
>  - Code for the same can be found here:
> https://patchwork.freedesktop.org/series/25091/
>  - And they will live happily ever after :).
> 
> Please provide input, feedback and suggestions on this design and
> plan, so that we can improve our half-cooked solution and start sending
> the patches.

Re: HDR support in Wayland/Weston

2019-01-10 Thread Sharma, Shashank
If the block diagram is not aligned due to your mail client, please refer to 
the attached .txt file. Hope that's slightly better :).


Regards

Shashank

On 1/10/2019 8:32 PM, Sharma, Shashank wrote:

Hello All,

This mail is to propose a design for enabling HDR support in the 
Wayland/Weston stack, using display engine capabilities, and to get more 
feedback and input from the community.
Here are a few points (you might already know these) about HDR 
framebuffers, videos and displays:
- HDR content/buffers are composed in the REC2020 colorspace, with bit 
depth 10/12/16 BPC. Some of the popular formats are P010, P012, P016.
- HDR content comes with its own metadata, to be applied to get the 
right luminance at the display device.
- The metadata can be of two types: 1. static, 2. dynamic. For 
simplicity, this solution focuses on static HDR only (HDR10 standard).
- HDR content also provides its supported EOTF (electro-optical 
transfer function) information, which is a curve (like the SRGB gamma 
curve). One popular EOTF is PQ (ST 2084).
- HDR-capable displays state their EOTF and HDR metadata support 
information in EDID CEA-861-G blocks.
- Normal SRGB buffers are composed in the SRGB color space following 
REC709 specifications.
- For accurate blending in display engines, we need to ensure the 
following:

- All the buffers are in the same colorspace (Rec 709 or Rec 2020)
- All the buffers are linear (gamma/EOTF removed)
- All the buffers are tone mapped in the same zone (HDR or SDR)

Please refer to the block diagram below, which presents a simple case 
of HDR P010 movie playback, with HDR buffers as video buffers and 
SDR buffers as subtitles. The subsystem looks and works like this:
- A client decodes the content (using FFmpeg, for example) and gets 
two buffers, one with video (HDR) and one with subtitles (SDR)

- The client passes the following information to the compositor:
 - The actual buffers
 - Their colorspace information: BT2020 for the HDR buffer, REC709 for 
the SDR buffer (planning to add a new protocol extension for this)
 - The HDR metadata of the content (planning to add a new protocol 
for this)


- Compositor's actions:
   - Reads the end display's HDR capabilities from the display EDID. 
Assume it's an HDR HDMI monitor.
   - Tone maps every view's framebuffer to match the tone of 
the end display, applying a libVA filter. In this example:
- The SDR subtitles frame will go through SDR-to-HDR tone 
mapping (called S2H)
- The HDR video frame will go through HDR-to-HDR tone mapping 
(called H2H) if the HDR capabilities of the monitor and the content differ.
- Now both the buffers and the monitor are in the same 
tone-mapped range.
- As the end display is HDR capable, and one of the content frames 
is HDR, the compositor will prepare all other planes for color space 
conversion (CSC) from REC709->REC2020 using the plane CSC property.
- As the CSC and blending should be done in linear space, the 
compositor will also use plane-level degamma to make the buffers linear.

- These actions will make sure that, during blending:
- All the buffers are in the same colorspace (REC2020)
- All the buffers are linear
- All the buffers are tone mapped (HDR)
- The plane-level color properties patch for DRM can be found 
here: https://patchwork.freedesktop.org/series/30875/
- Now, in order to re-apply the HDR curve, the compositor will apply 
CRTC-level gamma, so that the output buffer is non-linear again.
- To pass the output HDR information to the kernel, so that it can 
create and send AVI info-frames to HDMI, the compositor will set the 
connector HDR metadata property.
- Code for the same can be found here: 
https://patchwork.freedesktop.org/series/25091/

- And they all live happily ever after :).

Please provide inputs, feedback and suggestions for this design and 
plan, so that we can improve our half-cooked solution and start 
sending the patches.


 +------------------------+   +--------------------+
 | SDR Buffer (subtitles) |   | HDR Buffer (video) |
 | (REC 709 colorspace)   |   | (REC 2020 colorsp.)|
 +-----------+------------+   +---------+----------+
             |                          |
 +-----------v--------------------------v--------+   +----------------+
 | Compositor:                                   |   | LibVA          |
 | - assigns views to overlays                   +-->| Tone mapping   |
 | - prepare plane/CRTC color properties         |<--+ SDR to HDR     |
 |   for linear blending in display              |   | HDR to SDR     |
 +-----------------------------------------------+   +----------------+

HDR support in Wayland/Weston

2019-01-10 Thread Sharma, Shashank

Hello All,

This mail is to propose a design for enabling HDR support in the 
Wayland/Weston stack, using display engine capabilities, and to get more 
feedback and input from the community.
Here are a few points (you might already know these) about HDR 
framebuffers, videos and displays:
- HDR content/buffers are composed in the REC2020 colorspace, with bit 
depth 10/12/16 BPC. Some of the popular formats are P010, P012, P016.
- HDR content comes with its own metadata, to be applied to get the 
right luminance at the display device.
- The metadata can be of two types: 1. static, 2. dynamic. For 
simplicity, this solution focuses on static HDR only (HDR10 standard).
- HDR content also provides its supported EOTF (electro-optical transfer 
function) information, which is a curve (like the SRGB gamma curve). One 
popular EOTF is PQ (ST 2084).
- HDR-capable displays state their EOTF and HDR metadata support 
information in EDID CEA-861-G blocks.
- Normal SRGB buffers are composed in the SRGB color space following 
REC709 specifications.

- For accurate blending in display engines, we need to ensure the following:
- All the buffers are in the same colorspace (Rec 709 or Rec 2020)
- All the buffers are linear (gamma/EOTF removed)
- All the buffers are tone mapped in the same zone (HDR or SDR)

Please refer to the block diagram below, which presents a simple case of 
HDR P010 movie playback, with HDR buffers as video buffers and SDR 
buffers as subtitles. The subsystem looks and works like this:
- A client decodes the content (using FFmpeg, for example) and gets 
two buffers, one with video (HDR) and one with subtitles (SDR)

- The client passes the following information to the compositor:
 - The actual buffers
 - Their colorspace information: BT2020 for the HDR buffer, REC709 for 
the SDR buffer (planning to add a new protocol extension for this)
 - The HDR metadata of the content (planning to add a new protocol 
for this)


- Compositor's actions:
   - Reads the end display's HDR capabilities from the display EDID. 
Assume it's an HDR HDMI monitor.
   - Tone maps every view's framebuffer to match the tone of the end 
display, applying a libVA filter. In this example:
- The SDR subtitles frame will go through SDR-to-HDR tone 
mapping (called S2H)
- The HDR video frame will go through HDR-to-HDR tone mapping 
(called H2H) if the HDR capabilities of the monitor and the content differ.
- Now both the buffers and the monitor are in the same 
tone-mapped range.
- As the end display is HDR capable, and one of the content frames 
is HDR, the compositor will prepare all other planes for color space 
conversion (CSC) from REC709->REC2020 using the plane CSC property.
- As the CSC and blending should be done in linear space, the compositor 
will also use plane-level degamma to make the buffers linear.

- These actions will make sure that, during blending:
- All the buffers are in the same colorspace (REC2020)
- All the buffers are linear
- All the buffers are tone mapped (HDR)
- The plane-level color properties patch for DRM can be found 
here: https://patchwork.freedesktop.org/series/30875/
- Now, in order to re-apply the HDR curve, the compositor will apply 
CRTC-level gamma, so that the output buffer is non-linear again.
- To pass the output HDR information to the kernel, so that it can 
create and send AVI info-frames to HDMI, the compositor will set the 
connector HDR metadata property.
- Code for the same can be found here: 
https://patchwork.freedesktop.org/series/25091/

- And they all live happily ever after :).

Please provide inputs, feedback and suggestions for this design and 
plan, so that we can improve our half-cooked solution and start sending 
the patches.


 +------------------------+   +--------------------+
 | SDR Buffer (subtitles) |   | HDR Buffer (video) |
 | (REC 709 colorspace)   |   | (REC 2020 colorsp.)|
 +-----------+------------+   +---------+----------+
             |                          |
 +-----------v--------------------------v--------+   +----------------+
 | Compositor:                                   |   | LibVA          |
 | - assigns views to overlays                   +-->| Tone mapping   |
 | - prepare plane/CRTC color properties         |<--+ SDR to HDR     |
 |   for linear blending in display              |   | HDR to SDR     |
 +--------+-----------------------------+--------+   +----------------+
          |                             |
          | Tone mapped                 | Tone mapped
          | non-linear Rec709           | non-linear