Re: [RFC wayland-protocols v2 1/1] Add the color-management protocol

2019-02-28 Thread Chris Murphy
On Thu, Feb 28, 2019 at 4:37 AM Pekka Paalanen  wrote:
>
> another thought about a compositor implementation detail I would like
> to ask you all is about the blending space.
>
> If the compositor blending space was CIE XYZ with direct (linear)
> encoding to IEEE754 32-bit float values in pixels, with the units of Y
> chosen to match an absolute physical luminance value (or something that
> corresponds with the HDR specifications), would that be sufficient for
> all imaginable and realistic color reproduction purposes, HDR included?

CIE XYZ doesn't really have per se limits. It's always possible to
just add more photons, even if things start catching fire.

You can pick sRGB/Rec.709 primaries and define points inside or
outside those primaries, with 32-bit FP precision. This was the
rationalization used in the scRGB color space.
https://en.wikipedia.org/wiki/ScRGB
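To make the scRGB idea concrete, here's a minimal sketch (my own illustration, not from the spec) of the piecewise sRGB transfer function; in scRGB the linear values are simply allowed to fall outside [0, 1], which is how it encodes both wider gamut and extra dynamic range with the same primaries:

```python
def srgb_to_linear(v):
    # Piecewise sRGB EOTF (IEC 61966-2-1); scRGB stores these
    # linear values directly, permitting values outside [0, 1].
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v):
    # Inverse transfer function, for display-referred encoding.
    if v <= 0.0031308:
        return v * 12.92
    return 1.055 * v ** (1 / 2.4) - 0.055

# scRGB can represent colors sRGB cannot: negative components
# (outside the Rec.709 gamut) and values above 1.0 (HDR highlights).
out_of_gamut_pixel = (-0.2, 0.5, 3.0)
```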

OpenEXR assumes Rec.709 primaries if not specified, but offers quite a
bit more dynamic range than scRGB.
http://www.openexr.com/documentation/TechnicalIntroduction.pdf
http://www.openexr.com/documentation/OpenEXRColorManagement.pdf

An advantage of starting out with constraints is that you can much more
easily implement lower precision levels later, like 16-bpc float or even integer.

> Or do I have false assumptions about HDR specifications and they do
> not define brightness in physical absolute units but somehow in
> relative units? I think I saw "nit" as the unit somewhere which is an
> absolute physical unit.

It depends on which part of the specifications you're looking at. The
reference environment and reference medium are definitely defined in
absolute terms. The term "nit" is the same thing as candela per
square meter (cd/m^2), the unit for luminance; display black
luminance and white luminance use this unit. The environment is
described with the SI unit lux. The nit is used for projected light,
and lux is used for light incident on or emitted from a surface
(ceiling, walls, floor, etc.).
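As an illustrative back-of-envelope link between the two units (my own example, not from the spec): for an ideal Lambertian reflector, luminance in cd/m^2 is illuminance in lux times reflectance over pi:

```python
import math

def lambertian_luminance(illuminance_lux, reflectance=1.0):
    # Ideal diffuse (Lambertian) surface: L [cd/m^2] = E [lx] * R / pi
    return illuminance_lux * reflectance / math.pi

# e.g. a wall with 80% reflectance under 500 lx office lighting
wall_nits = lambertian_luminance(500.0, 0.80)  # roughly 127 cd/m^2
```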

In the SDR world, including an ICC v4 world, the display class profile
uses relative values: lightness, not luminance. Even when encoding
XYZ, the values are all relative to that display's white, where Y =
1.0. So yeah, for HDR that information is useless, and it is one of the
gotchas with ICC display class profiles. There are optional tags,
defined in the spec for many years now, to include measured display
black and white luminance. For HDR applications it would seem this
would have to be required information. Another gotcha that has mostly
been sorted out, I think, is whether the measurements are so-called
"contact" or "no contact" measurements: a contact measurement
won't account for veiling glare, which is the effect of ambient light
reflecting off the surface of the display, thereby increasing the
display's effective black luminance. A no-contact measurement will
account for it. You might think the no-contact measurement is better.
Well, yeah, maybe in a production environment where everything is
measured and stabilized.

But in a home, you might actually want to estimate veiling glare and
apply it to a no-contact display black luminance measurement. Maybe
you have a setting in a player with simple ambient descriptors such as
"dark", "moderate", and "bright" ambient conditions. The choices made
for handling HDR content in such cases are substantially different.
And if this could be done by polling an inexpensive sensor in the
environment, for example a camera on the display, so much the better.
Maybe.
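A hedged sketch of the two adjustments discussed above: scaling an ICC-style relative Y onto measured absolute luminance, and estimating veiling glare from an ambient level. The 1% screen reflectance and the Lambertian approximation are assumptions of mine, not measured values:

```python
import math

def absolute_luminance(y_rel, white_nits, black_nits=0.0):
    # Map ICC-style relative Y (white = 1.0) onto measured
    # absolute display black/white luminance in cd/m^2.
    return black_nits + y_rel * (white_nits - black_nits)

def effective_black(contact_black_nits, ambient_lux, reflectance=0.01):
    # Estimate veiling glare: ambient light reflecting off the panel
    # raises the effective black level (Lambertian approximation).
    return contact_black_nits + ambient_lux * reflectance / math.pi

# a "moderate" living-room ambient of ~100 lx on a 0.05 nit black
black = effective_black(0.05, 100.0)  # noticeably above 0.05
```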

> It might be heavy to use, both storage wise and computationally, but I
> think Weston should start with a gold standard approach that we can
> verify to be correct, encode the behaviour into the test suite, and
> then look at possible optimizations by looking at e.g. other blending
> spaces or opportunistically skipping the blending space.
>
> Would that color space work universally from the colorimetry and
> precision perspective, with any kind of gamut one might want/have, and
> so on?

The compositor is doing what kind of blending, for what purpose? I'd
expect any professional video rendering software to do this in its own
defined color space, encoding, and precision, and it all happens
internally. It might be a nice API, though, so that applications don't
have to keep reinventing that particular wheel internally.

In the near term, do you really expect to need blending beyond
Rec.2020/Rec.2100? Rec.2020/Rec.2100 is not so big that transforms to
Rec.709 will require special gamut mapping consideration. But I'm open
to other ideas.
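For what it's worth, the linear-light Rec.2020-to-Rec.709 conversion itself is just a 3x3 matrix (coefficients rounded from ITU-R BT.2087); it's the out-of-gamut results it produces that need clipping or gamut mapping afterwards:

```python
# Linear RGB Rec.2020 -> Rec.709 matrix (ITU-R BT.2087, rounded)
M = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def rec2020_to_rec709(rgb):
    # Plain matrix multiply on linear-light values; saturated
    # Rec.2020 colors land outside [0, 1] and still need handling.
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in M)

white = rec2020_to_rec709((1.0, 1.0, 1.0))   # stays ~ (1, 1, 1)
red = rec2020_to_rec709((1.0, 0.0, 0.0))     # G goes negative
```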

Blender, DaVinci, Lightworks, GIMP or GEGL, and Darktable folks might
have some input here.

-- 
Chris Murphy
_______________________________________________
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/wayland-devel

Re: HDR support in Wayland/Weston

2019-02-28 Thread Chris Murphy
On Thu, Feb 28, 2019 at 2:35 AM Pekka Paalanen  wrote:
>
> On Wed, 27 Feb 2019 13:47:07 -0700
> Chris Murphy  wrote:
>
> > On Wed, Feb 27, 2019 at 5:27 AM Pekka Paalanen  wrote:
> > >
> > > there is a single, unambiguous answer on Wayland: the compositor owns
> > > the pipeline. Therefore we won't have the kind of problems you describe
> > > above.
> > >
> > > These are the very reasons I am against adding any kind of protocol
> > > extension that would allow a client to directly touch the pipeline or
> > > to bypass the compositor.
> >
> > Well you need a client to do display calibration which necessarily
> > means altering the video LUT (to linear) in order to do the
> > measurements from which a correction curve is computed, and then that
> > client needs to install that curve into the video LUT. Now, colord
> > clearly has such capability, as it's applying vcgt tags in ICC
> > profiles now. If colord can do it, then what prevents other clients
> > from doing it?
>
> Hi Chris,
>
> there is no need to expose hardware knobs like LUT etc. directly in
> protocol even for measuring. We can have a special, privileged protocol
> extension for measurement apps, where the measuring intent is explicit,
> and the compositor can prepare the hardware correctly. This also avoids
> updating the measurement apps to follow the latest hardware features
> which the compositor might be using already. An old measurement app
> could be getting wrong result because it didn't know how to reset a new
> part in the pipeline that the compositor is using.
>
> Hence the compositor owns the pipeline at all times.
>
> Permanently setting the new pipeline parameters is compositor
> configuration - you wouldn't want to have to run the measurement app on
> every boot to just install the right parameters. Compositor
> configuration is again compositor specific. The privileged protocol
> extension could have a way to deliver the new output color profile to
> the compositor, for the compositor to save and apply it with any
> methods it happens to use. You cannot assume that the compositor will
> actually program "the hardware LUT" to achieve that, since there are
> many other ways to achieve the same and hardware capabilities vary.
> Nowadays there is often much more than just one LUT. Furthermore, in my
> recent reply to Niels' color management extension proposal I derived a
> case from the proposal where a compositor would be forced to use an
> identity LUT and instead do all transformation during rendering.
>
> Colord is not a client. Colord is currently called from a
> weston plugin, the plugin has access to the compositor internal API to
> set up a LUT. Colord cannot do it on its own.

That all seems reasonable.

I'm curious about legacy applications, including games, that used to
manipulate the actual hardware LUT in a video card: would such an
application ask the compositor to do it, and in that case could it
still do so?

Also, I'm curious about the multiple display use case. I think it's
quite a lot to ask a client to know about multiple transforms for
multiple displays, which pixels to transform and how based on which
display those pixels are currently shown on, and then to somehow cache
this intelligently so it doesn't bog down the whole system. A use case
I'm thinking of in particular is Firefox, where you really don't want
to constantly redo transforms every time a pixel scrolls away and
vanishes, only to reappear on a different display. You'd also want the
pixels to look correct from the very instant they appear on a
different display, to be correct upon reappearing on the original
display, and even to look correct in a split/mirror window scenario
(laptop display + projector).
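A hypothetical sketch of the caching described above, keying prepared transforms by (content profile, output profile) so a window crossing displays doesn't rebuild them. The class name and the injected build callback are illustrative assumptions, not any real compositor's API:

```python
class TransformCache:
    """Memoize color transforms per (source, destination) profile pair."""

    def __init__(self, build_transform):
        # build_transform is an injected callable; in practice it might
        # wrap something like an LCMS transform creation (assumption).
        self._build = build_transform
        self._cache = {}

    def get(self, src_profile, dst_profile):
        key = (src_profile, dst_profile)
        if key not in self._cache:
            self._cache[key] = self._build(src_profile, dst_profile)
        return self._cache[key]

# Pixels reappearing on another display then reuse that display's
# prepared transform instead of recomputing it from scratch.
```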

At least on macOS, and for the most part on Windows, most applications
aren't color management aware and just assume deviceRGB for
everything. At least on macOS by default, and on Windows as an option,
it's possible for the window manager to substitute sRGB for what is
really "legacy deviceRGB" as an intermediate space, and from there
properly do display compensation for pixels on whatever display they
appear on. Ergo, a display calibration app does need a way to announce
its abilities so that its test chart isn't assumed to be sRGB (or
whatever), and a smarter color-managed application needs a way of
saying one of two things:
a. I've already done color management, I *really do* need deviceRGB
b. display this, its color space is _.

Both types of applications exist. It might very well be reasonable to
say we're not going to support use case (a). Such smarter applications
would then have to do their color management however they want
internally, transform to a normalized color space like P3, Rec.2020,
or opRGB, and follow use case (b), tagging all content with that
normalized color space.

And all of this has an equivalent path and transform for printing, and
how to get sane output whether disp

Re: [RFC wayland-protocols v2 1/1] Add the color-management protocol

2019-02-28 Thread Kai-Uwe
Am 28.02.19 um 12:37 schrieb Pekka Paalanen:
> On Thu, 28 Feb 2019 09:12:57 +0100
> Kai-Uwe  wrote:
>
>> Am 27.02.19 um 14:17 schrieb Pekka Paalanen:
>>> On Tue, 26 Feb 2019 18:56:06 +0100
>>> Kai-Uwe  wrote:
>>>  
 Am 26.02.19 um 16:48 schrieb Pekka Paalanen:  
> On Sun, 22 Jan 2017 13:31:35 +0100
> Niels Ole Salscheider  wrote:
>
>> Signed-off-by: Niels Ole Salscheider   
>>
>> +
>> +  
>> +With this request, a device link profile can be attached to a
>> +wl_surface. For each output on which the surface is visible, the
>> +compositor will check if there is a device link profile. If 
>> there is one
>> +it will be used to directly convert the surface to the output 
>> color
>> +space. Blending of this surface (if necessary) will then be 
>> performed in
>> +the output color space and after the normal blending 
>> operations.
> Are those blending rules actually implementable?
>
> It is not generally possible to blend some surfaces into a temporary
> buffer, convert that to the next color space, and then blend some more,
> because the necessary order of blending operations depends on the
> z-order of the surfaces.
>
> What implications does this have on the CRTC color processing pipeline?
>
> If a CRTC color processing pipeline, that is, the transformation from
> framebuffer values to on-the-wire values for a monitor, is already set
> up by the compositor's preference, what would a device link profile
> look like? Does it produce on-the-wire or blending space?
>
> If the transformation defined by the device link profile produced
> values for the monitor wire, then the compositor will have to undo the
> CRTC pipeline transformation during composition for this surface, or it
> needs to reset CRTC pipeline setup to identity and apply it manually
> for all other surfaces.
>
> What is the use for a device link profile?
 A device link profile is useful to describe a transform from a buffer
 to match one specific output. Device links can give very fine-grained
 control to applications to decide what they want with their colors. This
 is useful in case an application wants to circumvent the default gamut
 mapping optimised for each output connected to a computer, or to add
 color effects like proofing. The intelligence is inside the device link
 profile and the compositor applies it as a dumb rule.  
>>> Hi Kai-Uwe,
>>>
>>> right, thank you. I did get the feeling right on what it is supposed to
>>> do, but I have hard time imagining how to implement that in a compositor
>>> that also needs to cater for other windows on the same output and blend
>>> them all together correctly.
>>>
>>> Even without blending, it means that the CRTC color manipulation
>>> features cannot really be used at all, because there are two
>>> conflicting transformations to apply: from compositor internal
>>> (blending) space to the output space, and from the application content
>>> space through the device link profile to the output space. The only
>>> way that could be realized without any additional reverse
>>> transformations is that the CRTC is set as an identity pass-through,
>>> and both kinds of transformations are done in the composite rendering
>>> with OpenGL or Vulkan.  
>> What are CRTC color manipulation features in wayland? blending?
Hello Pekka,
> Wayland exposes nothing of CRTC capabilities. I think that is the best.
>
> Blending windows together is implicit from allowing pixel formats with
> alpha. Even then, from the client perspective such blending is limited
> to sub-surfaces, since those are all a client is aware of.
...
>>> If we want device link profiles in the protocol, then I think that is
>>> the cost we have to pay. But that is just about performance, while to
>>> me it seems like correct blending would be impossible to achieve if
>>> there was another translucent window on top of the window using a
>>> device link profile. Or even worse, a stack like this:
>>>
>>> window B (color profile)
>>> window A (device link profile)
>>> wallpaper (color profile)  
>> Thanks for the simplification.
>>
>>> If both windows have translucency somewhere, they must be blended in
>>> that order. The blending of window A cannot be postponed after the
>>> others.  
>> Reminds me of the discussions we had with the Cairo people years ago.
> Was the conclusion the same, or have I mistaken something?

My general impression was that the need to fit outside requirements
(early color-managed colors) came into conflict with Cairo's concepts:
the corner cases were about blending output-referred colors, which
occasionally need to be in blending space, like you pointed out for
Wayland too. (But that is reasonably solved in other APIs too.
Imagine PostScript suddenly presenting transpa

Re: HDR support in Wayland/Weston

2019-02-28 Thread Adam Jackson
On Wed, 2019-02-27 at 13:47 -0700, Chris Murphy wrote:
> On Wed, Feb 27, 2019 at 5:27 AM Pekka Paalanen  wrote:
> > there is a single, unambiguous answer on Wayland: the compositor owns
> > the pipeline. Therefore we won't have the kind of problems you describe
> > above.
> > 
> > These are the very reasons I am against adding any kind of protocol
> > extension that would allow a client to directly touch the pipeline or
> > to bypass the compositor.
> 
> Well you need a client to do display calibration which necessarily
> means altering the video LUT (to linear) in order to do the
> measurements from which a correction curve is computed, and then that
> client needs to install that curve into the video LUT. Now, colord
> clearly has such capability, as it's applying vcgt tags in ICC
> profiles now. If colord can do it, then what prevents other clients
> from doing it?

The wayland server is capable of knowing the process on the other end
of the socket, and only exposing the color management control protocol
to specifically blessed clients.

- ajax


Re: [PATCH] unstable: add HDR Mastering Meta-data Protocol

2019-02-28 Thread Pekka Paalanen
On Wed, 27 Feb 2019 10:27:16 +0530
"Nautiyal, Ankit K"  wrote:

> From: Ankit Nautiyal 
> 
> This protocol enables a client to send the HDR metadata:
> MAX-CLL, MAX-FALL, Max Luminance and Min Luminance as defined by
> SMPTE ST.2086.
> The clients get these values for an HDR video, encoded for a video
> stream/file. MAX-CLL (Maximum Content Light Level) tells the brightest
> pixel in the entire stream/file in nits.
> MAX-FALL (Maximum Frame Average Light Level) tells the highest frame
> average brightness in nits for a single frame. Max and Min Luminance
> tells the max/min Luminance for the mastering display.
> These values give an idea of the brightness of the video which can be
> used by displays, so that they can adjust themselves for a better
> viewing experience.
> 
> The protocol depends on the Color Management Protocol which is still
> under review. There are a couple of proposed protocols by Niels Ole [1],
> and Sebastian Wick [2], which allow a client to select a color space
> for a surface, via ICC color profile files.
> The client is expected to set the color space using the ICC files and
> the color-management protocol.
> 
> [1] https://patchwork.freedesktop.org/patch/134570/
> [2] https://patchwork.freedesktop.org/patch/286062/
> 
> Co-authored-by: Harish Krupo 
> Signed-off-by: Ankit Nautiyal 

Hi Ankit,

thanks for working on this, comments inline.

I do wonder if this should just be baked into a color management
extension directly. More on that below.

> ---
>  Makefile.am|  1 +
>  unstable/hdr-mastering-metadata/README |  5 ++
>  .../hdr-mastering-metadata-unstable-v1.xml | 95 
> ++
>  3 files changed, 101 insertions(+)
>  create mode 100644 unstable/hdr-mastering-metadata/README
>  create mode 100644 
> unstable/hdr-mastering-metadata/hdr-mastering-metadata-unstable-v1.xml
> 
> diff --git a/Makefile.am b/Makefile.am
> index 345ae6a..c097080 100644
> --- a/Makefile.am
> +++ b/Makefile.am
> @@ -23,6 +23,7 @@ unstable_protocols =
> \
>   unstable/xdg-decoration/xdg-decoration-unstable-v1.xml  \
>   
> unstable/linux-explicit-synchronization/linux-explicit-synchronization-unstable-v1.xml
>  \
>   unstable/primary-selection/primary-selection-unstable-v1.xml
> \
> + unstable/hdr-mastering-metadata/hdr-mastering-metadata-unstable-v1.xml  
> \
>   $(NULL)
>  
>  stable_protocols =   
> \
> diff --git a/unstable/hdr-mastering-metadata/README 
> b/unstable/hdr-mastering-metadata/README
> new file mode 100644
> index 000..b567860
> --- /dev/null
> +++ b/unstable/hdr-mastering-metadata/README
> @@ -0,0 +1,5 @@
> +HDR-MASTERING-META-DATA-PROTOCOL
> +
> +Maintainers:
> +Ankit Nautiyal 
> +Harish Krupo 
> diff --git 
> a/unstable/hdr-mastering-metadata/hdr-mastering-metadata-unstable-v1.xml 
> b/unstable/hdr-mastering-metadata/hdr-mastering-metadata-unstable-v1.xml
> new file mode 100644
> index 000..aeddf39
> --- /dev/null
> +++ b/unstable/hdr-mastering-metadata/hdr-mastering-metadata-unstable-v1.xml

Could it be named hdr-mastering rather than hdr-mastering-metadata?
Shorter C function names wouldn't hurt.

> @@ -0,0 +1,95 @@
> +
> +
> +
> +  
> +Copyright © 2019 Intel
> +
> +Permission is hereby granted, free of charge, to any person obtaining a
> +copy of this software and associated documentation files (the 
> "Software"),
> +to deal in the Software without restriction, including without limitation
> +the rights to use, copy, modify, merge, publish, distribute, sublicense,
> +and/or sell copies of the Software, and to permit persons to whom the
> +Software is furnished to do so, subject to the following conditions:
> +
> +The above copyright notice and this permission notice (including the next
> +paragraph) shall be included in all copies or substantial portions of the
> +Software.
> +
> +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 
> OR
> +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
> +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.  IN NO EVENT SHALL
> +THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR 
> OTHER
> +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
> +FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
> +DEALINGS IN THE SOFTWARE.
> +  
> +
> +  

I think this chapter should explicitly refer to the SMPTE
specification. You did it in the commit message, but I think it would be
appropriate here.

The commit message explains a lot of what this is. The commit message
should concentrate on why this extension is needed and why it is like
this, and leave the what for the protocol documentation.

> +This protocol provides the ability to specify the mastering color vo

Re: [RFC wayland-protocols 1/1] unstable: add color management protocol

2019-02-28 Thread Pekka Paalanen
On Thu, 14 Feb 2019 03:46:21 +0100
Sebastian Wick  wrote:

> This protocol allows clients to attach a color space and rendering
> intent to a wl_surface. It further allows the client to be informed
> which color spaces a wl_surface was converted to on the last present.
> 
> Signed-off-by: Sebastian Wick 

Hi Sebastian,

thank you very much for working on this!

It would have been nice if you had referred to Niels' proposal and
maybe compared this one to it, especially if something here was
modelled after Niels' proposal.

Comments inline.

> ---
>  Makefile.am   |   1 +
>  unstable/color-management/README  |   4 +
>  .../color-management-unstable-v1.xml  | 183 ++
>  3 files changed, 188 insertions(+)
>  create mode 100644 unstable/color-management/README
>  create mode 100644 unstable/color-management/color-management-unstable-v1.xml
> 
> diff --git a/Makefile.am b/Makefile.am
> index 345ae6a..80abc1d 100644
> --- a/Makefile.am
> +++ b/Makefile.am
> @@ -23,6 +23,7 @@ unstable_protocols =
> \
>   unstable/xdg-decoration/xdg-decoration-unstable-v1.xml  \
>   
> unstable/linux-explicit-synchronization/linux-explicit-synchronization-unstable-v1.xml
>  \
>   unstable/primary-selection/primary-selection-unstable-v1.xml
> \
> + unstable/color-management/color-management-unstable-v1.xml  
> \
>   $(NULL)
>  
>  stable_protocols =   
> \
> diff --git a/unstable/color-management/README 
> b/unstable/color-management/README
> new file mode 100644
> index 000..140f1e9
> --- /dev/null
> +++ b/unstable/color-management/README
> @@ -0,0 +1,4 @@
> +Color management protocol
> +
> +Maintainers:
> +Sebastian Wick 
> diff --git a/unstable/color-management/color-management-unstable-v1.xml 
> b/unstable/color-management/color-management-unstable-v1.xml
> new file mode 100644
> index 000..1615fe6
> --- /dev/null
> +++ b/unstable/color-management/color-management-unstable-v1.xml
> @@ -0,0 +1,183 @@
> +
> +
> +
> +  
> +Copyright © 2019 Sebastian Wick
> +Copyright © 2019 Erwin Burema
> +
> +Permission is hereby granted, free of charge, to any person obtaining a
> +copy of this software and associated documentation files (the 
> "Software"),
> +to deal in the Software without restriction, including without limitation
> +the rights to use, copy, modify, merge, publish, distribute, sublicense,
> +and/or sell copies of the Software, and to permit persons to whom the
> +Software is furnished to do so, subject to the following conditions:
> +
> +The above copyright notice and this permission notice (including the next
> +paragraph) shall be included in all copies or substantial portions of the
> +Software.
> +
> +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS 
> OR
> +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
> +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.  IN NO EVENT SHALL
> +THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR 
> OTHER
> +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
> +FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
> +DEALINGS IN THE SOFTWARE.
> +  
> +
> +  
> +This protocol provides the ability to specify the color space
> +of a wl_surface. It further enumerates the color spaces currently
> +in the system and allows clients to query feedback about which color spaces
> +a wl_surface was converted to on the last present.
> +The idea behind the feedback system is to allow the client to do color
> +conversion to a color space which will likely be used to show the surface
> +which allows the compositor to skip the color conversion step.
> +  
> +
> +  
> +
> +  The color manager is a singleton global object that provides access
> +  to outputs' color spaces, lets you create new color spaces and attach
> +  a color space to surfaces.
> +

This interface is missing the destroy request.

> +
> +
> +  
> +render intents
> +  
> +  
> +  
> +  
> +  
> +
> +
> +
> +  
> +Well-known color spaces

Is a compositor required to send zwp_color_space_v1 objects for all of
these always?

Is the intention here that the compositor sends zwp_color_space_v1
objects for all those it happens to have the ICC profile at hand, and
if a client wants to use a yet another profile which is listed in this
enum, it still needs to provide it as an ICC profile file because the
compositor otherwise could not interpret it?

What if no output nor the compositor itself uses a profile listed here,
but it is still available to the compositor? E.g. sRGB when the
compositor only uses device specific profiles for all outputs and uses

Re: [RFC wayland-protocols v2 1/1] Add the color-management protocol

2019-02-28 Thread Pekka Paalanen
On Thu, 28 Feb 2019 09:12:57 +0100
Kai-Uwe  wrote:

> Am 27.02.19 um 14:17 schrieb Pekka Paalanen:
> > On Tue, 26 Feb 2019 18:56:06 +0100
> > Kai-Uwe  wrote:
> >  
> >> Am 26.02.19 um 16:48 schrieb Pekka Paalanen:  
> >>> On Sun, 22 Jan 2017 13:31:35 +0100
> >>> Niels Ole Salscheider  wrote:
> >>>
>  Signed-off-by: Niels Ole Salscheider   
> 
>  +
>  +  
>  +With this request, a device link profile can be attached to a
>  +wl_surface. For each output on which the surface is visible, the
>  +compositor will check if there is a device link profile. If 
>  there is one
>  +it will be used to directly convert the surface to the output 
>  color
>  +space. Blending of this surface (if necessary) will then be 
>  performed in
>  +the output color space and after the normal blending 
>  operations.
> >>> Are those blending rules actually implementable?
> >>>
> >>> It is not generally possible to blend some surfaces into a temporary
> >>> buffer, convert that to the next color space, and then blend some more,
> >>> because the necessary order of blending operations depends on the
> >>> z-order of the surfaces.
> >>>
> >>> What implications does this have on the CRTC color processing pipeline?
> >>>
> >>> If a CRTC color processing pipeline, that is, the transformation from
> >>> framebuffer values to on-the-wire values for a monitor, is already set
> >>> up by the compositor's preference, what would a device link profile
> >>> look like? Does it produce on-the-wire or blending space?
> >>>
> >>> If the transformation defined by the device link profile produced
> >>> values for the monitor wire, then the compositor will have to undo the
> >>> CRTC pipeline transformation during composition for this surface, or it
> >>> needs to reset CRTC pipeline setup to identity and apply it manually
> >>> for all other surfaces.
> >>>
> >>> What is the use for a device link profile?
> >> A device link profile is useful to describe a transform from a buffer
> >> to match one specific output. Device links can give very fine-grained
> >> control to applications to decide what they want with their colors. This
> >> is useful in case an application wants to circumvent the default gamut
> >> mapping optimised for each output connected to a computer, or to add
> >> color effects like proofing. The intelligence is inside the device link
> >> profile and the compositor applies it as a dumb rule.  
> > Hi Kai-Uwe,
> >
> > right, thank you. I did get the feeling right on what it is supposed to
> > do, but I have hard time imagining how to implement that in a compositor
> > that also needs to cater for other windows on the same output and blend
> > them all together correctly.
> >
> > Even without blending, it means that the CRTC color manipulation
> > features cannot really be used at all, because there are two
> > conflicting transformations to apply: from compositor internal
> > (blending) space to the output space, and from the application content
> > space through the device link profile to the output space. The only
> > way that could be realized without any additional reverse
> > transformations is that the CRTC is set as an identity pass-through,
> > and both kinds of transformations are done in the composite rendering
> > with OpenGL or Vulkan.  

> What are CRTC color manipulation features in wayland? blending?

Hi Kai-Uwe,

Wayland exposes nothing of CRTC capabilities. I think that is the best.

Blending windows together is implicit from allowing pixel formats with
alpha. Even then, from the client perspective such blending is limited
to sub-surfaces, since those are all a client is aware of.
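The z-order constraint discussed later in this thread follows from the "over" operator itself being order-dependent; a minimal single-channel sketch (my own illustration):

```python
def over(src, src_alpha, dst):
    # Porter-Duff "over" for one channel, straight (non-premultiplied) alpha.
    return src * src_alpha + dst * (1.0 - src_alpha)

wallpaper = 0.0
# window A (value 0.5) and window B (value 1.0), both at alpha 0.5:
b_on_top = over(1.0, 0.5, over(0.5, 0.5, wallpaper))  # blend A first
a_on_top = over(0.5, 0.5, over(1.0, 0.5, wallpaper))  # blend B first
# Different results, so blending steps cannot be reordered or postponed.
assert b_on_top != a_on_top
```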

> > If we want device link profiles in the protocol, then I think that is
> > the cost we have to pay. But that is just about performance, while to
> > me it seems like correct blending would be impossible to achieve if
> > there was another translucent window on top of the window using a
> > device link profile. Or even worse, a stack like this:
> >
> > window B (color profile)
> > window A (device link profile)
> > wallpaper (color profile)  
> 
> Thanks for the simplification.
> 
> > If both windows have translucency somewhere, they must be blended in
> > that order. The blending of window A cannot be postponed after the
> > others.  
> 
> Reminds me of the discussions we had with the Cairo people years ago.

Was the conclusion the same, or have I mistaken something?

> > I guess that implies that if even one surface on an output uses a
> > device link profile, then all blending must be done in the output color
> > space instead of an intermediate blending space. Is that an acceptable
> > trade-off?  
> 
> It will make "high quality" apps look like blending fun-stoppers. Not so
> nice. In contrast, the conversion back from output space to blending
> space, then blending, and then conversion to output will maintain t

Re: [PATCH] unstable: add HDR Mastering Meta-data Protocol

2019-02-28 Thread Nautiyal, Ankit K

Hi,

On 2/27/2019 2:28 PM, Erwin Burema wrote:

Hi,

On Wed, 27 Feb 2019 at 05:47, Nautiyal, Ankit K
 wrote:

From: Ankit Nautiyal 

This protocol enables a client to send the HDR metadata:
MAX-CLL, MAX-FALL, Max Luminance and Min Luminance as defined by
SMPTE ST.2086.
The clients get these values for an HDR video, encoded for a video
stream/file. MAX-CLL (Maximum Content Light Level) tells the brightest
pixel in the entire stream/file in nits.
MAX-FALL (Maximum Frame Average Light Level) tells the highest frame
average brightness in nits for a single frame. Max and Min Luminance
tells the max/min Luminance for the mastering display.
These values give an idea of the brightness of the video which can be
used by displays, so that they can adjust themselves for a better
viewing experience.
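The MaxCLL/MaxFALL definitions above can be sketched as a single pass over decoded frames (the frame representation here is an illustrative assumption, not anything from the protocol):

```python
def compute_cll_fall(frames):
    # frames: iterable of 2D lists of per-pixel light levels in nits
    max_cll = 0.0   # brightest single pixel anywhere in the stream
    max_fall = 0.0  # highest per-frame average light level
    for frame in frames:
        pixels = [px for row in frame for px in row]
        max_cll = max(max_cll, max(pixels))
        max_fall = max(max_fall, sum(pixels) / len(pixels))
    return max_cll, max_fall
```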


This does sound quite good for video players (and in the future image
viewers), but might have something missing for HDR content generation
(Blender, Krita, Natron, etc) since you do not always know these
values in advance (it is effectively before mastering), most of these
work in scene linear with the convention that 0.18 is middle grey
(although this is just a convention). So I think that in these cases
we might need to get info from the display system on what luminance
levels its supports.

Hope this makes sense since at the way to busy with other obligations
so not much time to look into this.
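For illustration only, the scene-linear middle-grey convention mentioned above can be sketched as a naive exposure mapping (the 10-nit anchor is an arbitrary assumption of mine; real tone mapping involves highlight roll-off and much more):

```python
def scene_to_nits(scene_linear, grey_nits=10.0, middle_grey=0.18):
    # Naive linear exposure: scene value 0.18 maps to grey_nits on
    # screen. No roll-off; results can exceed the display's peak,
    # which is why the client would want to know the display's limits.
    return scene_linear * (grey_nits / middle_grey)

peak_needed = scene_to_nits(16.0)  # a 16x-over-grey highlight, ~889 nits
```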


Yes, you are right, there would be cases where HDR content does not
have these values.
The HDR luminance levels for a display are exposed to the compositor
through the EDID from the kernel.
We can discuss whether a display's luminance levels can be exposed by
the compositor to the client, and also how to do it.
But currently, the present patch is focused on HDR video players, as it
would be difficult to implement and review all the
scenarios/requirements in one go. Once things are settled and agreed
upon for this case, we can add support for exposing these luminance
values too, as per the discussions.

Does that make sense?

Regards,
Ankit


The protocol depends on the Color Management Protocol, which is still
under review. There are a couple of proposed protocols, by Niels Ole
[1] and Sebastian Wick [2], which allow a client to select a color
space for a surface via ICC color profile files.
The client is expected to set the color space using the ICC files and
the color-management protocol.

[1] https://patchwork.freedesktop.org/patch/134570/
[2] https://patchwork.freedesktop.org/patch/286062/

Co-authored-by: Harish Krupo 
Signed-off-by: Ankit Nautiyal 
---
  Makefile.am|  1 +
  unstable/hdr-mastering-metadata/README |  5 ++
  .../hdr-mastering-metadata-unstable-v1.xml | 95 ++
  3 files changed, 101 insertions(+)
  create mode 100644 unstable/hdr-mastering-metadata/README
  create mode 100644 unstable/hdr-mastering-metadata/hdr-mastering-metadata-unstable-v1.xml

diff --git a/Makefile.am b/Makefile.am
index 345ae6a..c097080 100644
--- a/Makefile.am
+++ b/Makefile.am
@@ -23,6 +23,7 @@ unstable_protocols = \
 	unstable/xdg-decoration/xdg-decoration-unstable-v1.xml \
 	unstable/linux-explicit-synchronization/linux-explicit-synchronization-unstable-v1.xml \
 	unstable/primary-selection/primary-selection-unstable-v1.xml \
+	unstable/hdr-mastering-metadata/hdr-mastering-metadata-unstable-v1.xml \
 	$(NULL)
 
 stable_protocols = \
diff --git a/unstable/hdr-mastering-metadata/README b/unstable/hdr-mastering-metadata/README
new file mode 100644
index 000..b567860
--- /dev/null
+++ b/unstable/hdr-mastering-metadata/README
@@ -0,0 +1,5 @@
+HDR-MASTERING-META-DATA-PROTOCOL
+
+Maintainers:
+Ankit Nautiyal 
+Harish Krupo 
diff --git a/unstable/hdr-mastering-metadata/hdr-mastering-metadata-unstable-v1.xml b/unstable/hdr-mastering-metadata/hdr-mastering-metadata-unstable-v1.xml
new file mode 100644
index 000..aeddf39
--- /dev/null
+++ b/unstable/hdr-mastering-metadata/hdr-mastering-metadata-unstable-v1.xml
@@ -0,0 +1,95 @@
+
+
+
+  
+Copyright © 2019 Intel
+
+Permission is hereby granted, free of charge, to any person obtaining a
+copy of this software and associated documentation files (the "Software"),
+to deal in the Software without restriction, including without limitation
+the rights to use, copy, modify, merge, publish, distribute, sublicense,
+and/or sell copies of the Software, and to permit persons to whom the
+Software is furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice (including the next
+paragraph) shall be included in all copies or substantial portions of the
+Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMP

Re: HDR support in Wayland/Weston

2019-02-28 Thread Pekka Paalanen
On Wed, 27 Feb 2019 13:47:07 -0700
Chris Murphy  wrote:

> On Wed, Feb 27, 2019 at 5:27 AM Pekka Paalanen  wrote:
> >
> > there is a single, unambiguous answer on Wayland: the compositor owns
> > the pipeline. Therefore we won't have the kind of problems you describe
> > above.
> >
> > These are the very reasons I am against adding any kind of protocol
> > extension that would allow a client to directly touch the pipeline or
> > to bypass the compositor.  
> 
> Well you need a client to do display calibration which necessarily
> means altering the video LUT (to linear) in order to do the
> measurements from which a correction curve is computed, and then that
> client needs to install that curve into the video LUT. Now, colord
> clearly has such capability, as it's applying vcgt tags in ICC
> profiles now. If colord can do it, then what prevents other clients
> from doing it?

Hi Chris,

there is no need to expose hardware knobs like LUT etc. directly in
protocol even for measuring. We can have a special, privileged protocol
extension for measurement apps, where the measuring intent is explicit,
and the compositor can prepare the hardware correctly. This also
avoids having to update measurement apps to follow the latest hardware
features which the compositor might already be using. An old
measurement app could get wrong results because it did not know how to
reset a new part of the pipeline that the compositor is using.

Hence the compositor owns the pipeline at all times.
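As a sketch of what "preparing the hardware" could involve (an assumption, not code from this thread): when a privileged measurement client announces its intent, the compositor could program an identity LUT, e.g. via libdrm's drmModeCrtcSetGamma(), so pixel values pass through unmodified. Only the LUT fill is shown here:

```c
#include <assert.h>
#include <stdint.h>

/* Fill a legacy gamma LUT with the identity ramp: index 0 maps to 0,
 * the last index maps to full scale (0xffff), linearly in between. */
static void fill_identity_lut(uint16_t *lut, int size)
{
	for (int i = 0; i < size; i++)
		lut[i] = (uint16_t)((uint64_t)i * 0xffff / (size - 1));
}
```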

Permanently setting the new pipeline parameters is compositor
configuration - you wouldn't want to have to run the measurement app on
every boot to just install the right parameters. Compositor
configuration is again compositor specific. The privileged protocol
extension could have a way to deliver the new output color profile to
the compositor, for the compositor to save and apply it with any
methods it happens to use. You cannot assume that the compositor will
actually program "the hardware LUT" to achieve that, since there are
many other ways to achieve the same and hardware capabilities vary.
Nowadays there is often much more than just one LUT. Furthermore, in my
recent reply to Niels' color management extension proposal I derived a
case from the proposal where a compositor would be forced to use an
identity LUT and instead do all transformation during rendering.

Colord is not a client. Colord is currently called from a Weston
plugin; the plugin has access to the compositor's internal API to set
up a LUT. Colord cannot do it on its own.


Thanks,
pq



Re: [RFC wayland-protocols v2 1/1] Add the color-management protocol

2019-02-28 Thread Kai-Uwe
Am 27.02.19 um 14:17 schrieb Pekka Paalanen:
> On Tue, 26 Feb 2019 18:56:06 +0100
> Kai-Uwe  wrote:
>
>> Am 26.02.19 um 16:48 schrieb Pekka Paalanen:
>>> On Sun, 22 Jan 2017 13:31:35 +0100
>>> Niels Ole Salscheider  wrote:
>>>  
 Signed-off-by: Niels Ole Salscheider   

+
+  
+With this request, a device link profile can be attached to a
+wl_surface. For each output on which the surface is visible, the
+compositor will check if there is a device link profile. If there is one
+it will be used to directly convert the surface to the output color
+space. Blending of this surface (if necessary) will then be performed in
+the output color space and after the normal blending operations.
>>> Are those blending rules actually implementable?
>>>
>>> It is not generally possible to blend some surfaces into a temporary
>>> buffer, convert that to the next color space, and then blend some more,
>>> because the necessary order of blending operations depends on the
>>> z-order of the surfaces.
>>>
>>> What implications does this have on the CRTC color processing pipeline?
>>>
>>> If a CRTC color processing pipeline, that is, the transformation from
>>> framebuffer values to on-the-wire values for a monitor, is already set
>>> up by the compositor's preference, what would a device link profile
>>> look like? Does it produce on-the-wire or blending space?
>>>
>>> If the transformation defined by the device link profile produced
>>> values for the monitor wire, then the compositor will have to undo the
>>> CRTC pipeline transformation during composition for this surface, or it
>>> needs to reset CRTC pipeline setup to identity and apply it manually
>>> for all other surfaces.
>>>
>>> What is the use for a device link profile?  
>> A device link profile is useful to describe a transform from a buffer to
>> match one specific output. Device links can give very fine-grained
>> control to applications to decide what they want to do with their
>> colors. This is useful in case an application wants to circumvent the
>> default gamut mapping optimised for each output connected to a
>> computer, or to add color effects like proofing. The intelligence is
>> inside the device link profile and the compositor applies it as a dumb
>> rule.
> Hi Kai-Uwe,
>
> right, thank you. I did get the feeling right on what it is supposed to
> do, but I have hard time imagining how to implement that in a compositor
> that also needs to cater for other windows on the same output and blend
> them all together correctly.
>
> Even without blending, it means that the CRTC color manipulation
> features cannot really be used at all, because there are two
> conflicting transformations to apply: from compositor internal
> (blending) space to the output space, and from the application content
> space through the device link profile to the output space. The only
> way that could be realized without any additional reverse
> transformations is that the CRTC is set as an identity pass-through,
> and both kinds of transformations are done in the composite rendering
> with OpenGL or Vulkan.
What are CRTC color manipulation features in Wayland? Blending?
> If we want device link profiles in the protocol, then I think that is
> the cost we have to pay. But that is just about performance, while to
> me it seems like correct blending would be impossible to achieve if
> there was another translucent window on top of the window using a
> device link profile. Or even worse, a stack like this:
>
> window B (color profile)
> window A (device link profile)
> wallpaper (color profile)

Thanks for the simplification.

> If both windows have translucency somewhere, they must be blended in
> that order. The blending of window A cannot be postponed after the
> others.
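The ordering constraint can be sketched in a few lines: src-over blending is not commutative, so window A cannot be composited after window B. Colors below are premultiplied-alpha RGBA, and the values are purely illustrative:

```c
#include <assert.h>

struct rgba { double r, g, b, a; };

/* Premultiplied-alpha src-over: src composited on top of dst. */
static struct rgba blend_over(struct rgba dst, struct rgba src)
{
	struct rgba out = {
		src.r + dst.r * (1.0 - src.a),
		src.g + dst.g * (1.0 - src.a),
		src.b + dst.b * (1.0 - src.a),
		src.a + dst.a * (1.0 - src.a),
	};
	return out;
}
```

Blending wallpaper, then A, then B gives a different result from blending wallpaper, then B, then A, which is why A's device-link conversion cannot simply be postponed past the other surfaces.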

Reminds me of the discussions we had with the Cairo people years ago.

> I guess that implies that if even one surface on an output uses a
> device link profile, then all blending must be done in the output color
> space instead of an intermediate blending space. Is that an acceptable
> trade-off?

It will make "high quality" apps look like blending fun stoppers. Not so
nice. In contrast, converting back from output space to blending space,
then blending, and then converting to output space would maintain the
blending-space experience at some performance cost, but break the
original device link intent. Would that be an acceptable trade-off for
you? (So app developers should never use translucent portions; however,
the toolkit or compositor might introduce translucency anyway, e.g. for
window decorations, and that outside translucency would break the app's
intent.)

> Does that even make any difference if the output space was linear at
> the blending step, and gamma was applied after that?

Interesting. There came the argument that the complete graphics pipeline
should be used for measuring the device for profile generation. So, if
linear