Re: The state of Quantization Range handling

2022-11-21 Thread Pekka Paalanen
On Fri, 18 Nov 2022 15:53:29 +
Dave Stevenson  wrote:

> Hi Pekka
> 
> On Fri, 18 Nov 2022 at 10:15, Pekka Paalanen  wrote:
> >
> > On Thu, 17 Nov 2022 22:13:26 +0100
> > Sebastian Wick  wrote:
> >  
> > > Hi Dave,
> > >
> > > I noticed that I didn't get the Broadcast RGB property thanks to you
> > > (more below)
> > >
> > > On Tue, Nov 15, 2022 at 2:16 PM Dave Stevenson
> > >  wrote:  
> > > >
> > > > Hi Sebastian
> > > >
> > > > Thanks for starting the conversation - it's stalled a number of times
> > > > previously.
> > > >
> > > > On Mon, 14 Nov 2022 at 23:12, Sebastian Wick 
> > > >  wrote:  
> > > > >
> > > > > There are still regular bug reports about monitors (sinks) and sources
> > > > > disagreeing about the quantization range of the pixel data. In
> > > > > particular sources sending full range data when the sink expects
> > > > > limited range. From a user space perspective, this is all hidden in
> > > > > the kernel. We send full range data to the kernel and then hope it
> > > > > does the right thing but as the bug reports show: some combinations of
> > > > > displays and drivers result in problems.  
> > > >
> > > > I'll agree that we as Raspberry Pi also get a number of bug reports
> > > > where sinks don't always look at the infoframes and misinterpret the
> > > > data.
> > > >  
> > > > > In general the whole handling of the quantization range on linux is
> > > > > not defined or documented at all. User space sends full range data
> > > > > because that's what seems to work most of the time but technically
> > > > > this is all undefined and user space can not fix those issues. Some
> > > > > compositors have resorted to giving users the option to choose the
> > > > > quantization range but this really should only be necessary for
> > > > > straight up broken hardware.  
> > > >
> > > > Wowsers! Making userspace worry about limited range data would be a
> > > > very weird decision in my view, so compositors should always deal in
> > > > full range data.  
> > >
> > > Making this a user space problem is IMO the ideal way to deal with it
> > > but that's a bit harder to do (I'll answer that in the reply to
> > > Pekka). So let's just assume we all agree that user space only deals
> > > with full range data.  
> >
> > Limited range was invented for some reason, so it must have some use
> > somewhere, at least in the past. Maybe it was needed to calibrate mixed
> > digital/analog video processing chains with test images that needed to
> > contain sub-blacks and super-whites, to make sure that sub-blacks come
> > out as the nominal black etc. Just because desktop computers do not
> > seem to have any need for limited range, I personally wouldn't be as
> > arrogant as to say it's never useful. Maybe there are professional
> > video/broadcasting needs that currently can only be realized with
> > proprietary OS/hardware, because Linux just can't do it today?
> >
> > Why would TVs support limited range, if it was never useful? Why would
> > video sources produce limited range if it was always strictly inferior
> > to full range?
> >
> > Even digital image processing algorithms might make use of
> > out-of-unit-range values, not just analog circuitry for overshoot.
> >
> > But no, I can't give a real example, just speculation. Hence it's fine
> > by me to discard limited range processing for now. Still, what I
> > explain below would allow limited range processing without any extra
> > complexity by making the KMS color pipeline better defined and less
> > limiting for userspace.  
> 
> AIUI limited range comes from the analogue world, or possibly creative
> (film/TV) world, hence being used on Consumer devices rather than IT
> ones (CTA and CEA modes vs VESA and DMT modes).
> 
> YCbCr output from video codecs typically uses a range of 16-235,
> therefore a media player wanting to pass the decoded video out to the
> display exactly as-is needs to be able to signal that to the display
> for it to be interpreted correctly.
> 
> HDMI extended DVI. I believe both YCbCr support and range control were
> added to the HDMI spec at the same time, presumably to allow for this
> use case. Limited range RGB seems to be a bit of a quirk though.
> 
> Just to be annoying, JPEG uses full range YCbCr.
> 
> > > > How would composition of multiple DRM planes work if some are limited
> > > > range and some are full but you want limited range output? Your
> > > > hardware needs to have CSC matrices to convert full range down to
> > > > limited range, and know that you want to use them to effectively
> > > > compose to limited range.
> > > > In fact you can't currently tell DRM that an RGB plane is limited
> > > > range - the values in enum drm_color_range are
> > > > DRM_COLOR_YCBCR_LIMITED_RANGE and DRM_COLOR_YCBCR_FULL_RANGE [1].  
> >
> > Yeah, that's because range conversion has been conflated with
> > YUV-to-RGB conversion, and the result is always full-range RGB in
> > practise, AFAIU. There is no way to feed limited range color into the
> > further color pipeline in KMS, but that's actually a good thing.(*)

Re: The state of Quantization Range handling

2022-11-18 Thread Harry Wentland



On 11/18/22 10:53, Dave Stevenson wrote:
> Hi Pekka
> 
> On Fri, 18 Nov 2022 at 10:15, Pekka Paalanen  wrote:
>>
>> On Thu, 17 Nov 2022 22:13:26 +0100
>> Sebastian Wick  wrote:
>>
>>> Hi Dave,
>>>
>>> I noticed that I didn't get the Broadcast RGB property thanks to you
>>> (more below)
>>>
>>> On Tue, Nov 15, 2022 at 2:16 PM Dave Stevenson
>>>  wrote:

 Hi Sebastian

 Thanks for starting the conversation - it's stalled a number of times
 previously.

 On Mon, 14 Nov 2022 at 23:12, Sebastian Wick  
 wrote:
>
> There are still regular bug reports about monitors (sinks) and sources
> disagreeing about the quantization range of the pixel data. In
> particular sources sending full range data when the sink expects
> limited range. From a user space perspective, this is all hidden in
> the kernel. We send full range data to the kernel and then hope it
> does the right thing but as the bug reports show: some combinations of
> displays and drivers result in problems.

 I'll agree that we as Raspberry Pi also get a number of bug reports
 where sinks don't always look at the infoframes and misinterpret the
 data.

> In general the whole handling of the quantization range on linux is
> not defined or documented at all. User space sends full range data
> because that's what seems to work most of the time but technically
> this is all undefined and user space can not fix those issues. Some
> compositors have resorted to giving users the option to choose the
> quantization range but this really should only be necessary for
> straight up broken hardware.

 Wowsers! Making userspace worry about limited range data would be a
 very weird decision in my view, so compositors should always deal in
 full range data.
>>>
>>> Making this a user space problem is IMO the ideal way to deal with it
>>> but that's a bit harder to do (I'll answer that in the reply to
>>> Pekka). So let's just assume we all agree that user space only deals
>>> with full range data.
>>
>> Limited range was invented for some reason, so it must have some use
>> somewhere, at least in the past. Maybe it was needed to calibrate mixed
>> digital/analog video processing chains with test images that needed to
>> contain sub-blacks and super-whites, to make sure that sub-blacks come
>> out as the nominal black etc. Just because desktop computers do not
>> seem to have any need for limited range, I personally wouldn't be as
>> arrogant as to say it's never useful. Maybe there are professional
>> video/broadcasting needs that currently can only be realized with
>> proprietary OS/hardware, because Linux just can't do it today?
>>
>> Why would TVs support limited range, if it was never useful? Why would
>> video sources produce limited range if it was always strictly inferior
>> to full range?
>>
>> Even digital image processing algorithms might make use of
>> out-of-unit-range values, not just analog circuitry for overshoot.
>>
>> But no, I can't give a real example, just speculation. Hence it's fine
>> by me to discard limited range processing for now. Still, what I
>> explain below would allow limited range processing without any extra
>> complexity by making the KMS color pipeline better defined and less
>> limiting for userspace.
> 
> AIUI limited range comes from the analogue world, or possibly creative
> (film/TV) world, hence being used on Consumer devices rather than IT
> ones (CTA and CEA modes vs VESA and DMT modes).
> 
> YCbCr output from video codecs typically uses a range of 16-235,
> therefore a media player wanting to pass the decoded video out to the
> display exactly as-is needs to be able to signal that to the display
> for it to be interpreted correctly.
> HDMI extended DVI. I believe both YCbCr support and range control were
> added to the HDMI spec at the same time, presumably to allow for this
> use case. Limited range RGB seems to be a bit of a quirk though.
> 
> Just to be annoying, JPEG uses full range YCbCr.
> 
 How would composition of multiple DRM planes work if some are limited
 range and some are full but you want limited range output? Your
 hardware needs to have CSC matrices to convert full range down to
 limited range, and know that you want to use them to effectively
 compose to limited range.
 In fact you can't currently tell DRM that an RGB plane is limited
 range - the values in enum drm_color_range are
 DRM_COLOR_YCBCR_LIMITED_RANGE and DRM_COLOR_YCBCR_FULL_RANGE [1].
>>
>> Yeah, that's because range conversion has been conflated with
>> YUV-to-RGB conversion, and the result is always full-range RGB in
>> practise, AFAIU. There is no way to feed limited range color into the
>> further color pipeline in KMS, but that's actually a good thing.(*)
>>
>> The following is my opinion of the future, as someone who has been
>> thinking about how to make HDR work on Wayland while allowing the
>> display quality and hardware optimizations that Wayland was designed
>> for:

Re: The state of Quantization Range handling

2022-11-18 Thread Dave Stevenson
Hi Pekka

On Fri, 18 Nov 2022 at 10:15, Pekka Paalanen  wrote:
>
> On Thu, 17 Nov 2022 22:13:26 +0100
> Sebastian Wick  wrote:
>
> > Hi Dave,
> >
> > I noticed that I didn't get the Broadcast RGB property thanks to you
> > (more below)
> >
> > On Tue, Nov 15, 2022 at 2:16 PM Dave Stevenson
> >  wrote:
> > >
> > > Hi Sebastian
> > >
> > > Thanks for starting the conversation - it's stalled a number of times
> > > previously.
> > >
> > > On Mon, 14 Nov 2022 at 23:12, Sebastian Wick  
> > > wrote:
> > > >
> > > > There are still regular bug reports about monitors (sinks) and sources
> > > > disagreeing about the quantization range of the pixel data. In
> > > > particular sources sending full range data when the sink expects
> > > > limited range. From a user space perspective, this is all hidden in
> > > > the kernel. We send full range data to the kernel and then hope it
> > > > does the right thing but as the bug reports show: some combinations of
> > > > displays and drivers result in problems.
> > >
> > > I'll agree that we as Raspberry Pi also get a number of bug reports
> > > where sinks don't always look at the infoframes and misinterpret the
> > > data.
> > >
> > > > In general the whole handling of the quantization range on linux is
> > > > not defined or documented at all. User space sends full range data
> > > > because that's what seems to work most of the time but technically
> > > > this is all undefined and user space can not fix those issues. Some
> > > > compositors have resorted to giving users the option to choose the
> > > > quantization range but this really should only be necessary for
> > > > straight up broken hardware.
> > >
> > > Wowsers! Making userspace worry about limited range data would be a
> > > very weird decision in my view, so compositors should always deal in
> > > full range data.
> >
> > Making this a user space problem is IMO the ideal way to deal with it
> > but that's a bit harder to do (I'll answer that in the reply to
> > Pekka). So let's just assume we all agree that user space only deals
> > with full range data.
>
> Limited range was invented for some reason, so it must have some use
> somewhere, at least in the past. Maybe it was needed to calibrate mixed
> digital/analog video processing chains with test images that needed to
> contain sub-blacks and super-whites, to make sure that sub-blacks come
> out as the nominal black etc. Just because desktop computers do not
> seem to have any need for limited range, I personally wouldn't be as
> arrogant as to say it's never useful. Maybe there are professional
> video/broadcasting needs that currently can only be realized with
> proprietary OS/hardware, because Linux just can't do it today?
>
> Why would TVs support limited range, if it was never useful? Why would
> video sources produce limited range if it was always strictly inferior
> to full range?
>
> Even digital image processing algorithms might make use of
> out-of-unit-range values, not just analog circuitry for overshoot.
>
> But no, I can't give a real example, just speculation. Hence it's fine
> by me to discard limited range processing for now. Still, what I
> explain below would allow limited range processing without any extra
> complexity by making the KMS color pipeline better defined and less
> limiting for userspace.

AIUI limited range comes from the analogue world, or possibly creative
(film/TV) world, hence being used on Consumer devices rather than IT
ones (CTA and CEA modes vs VESA and DMT modes).

YCbCr output from video codecs typically uses a range of 16-235,
therefore a media player wanting to pass the decoded video out to the
display exactly as-is needs to be able to signal that to the display
for it to be interpreted correctly.
HDMI extended DVI. I believe both YCbCr support and range control were
added to the HDMI spec at the same time, presumably to allow for this
use case. Limited range RGB seems to be a bit of a quirk though.

Just to be annoying, JPEG uses full range YCbCr.
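
For reference, the scaling between the two ranges is linear; a minimal C
sketch of the 8-bit RGB/luma case (helper names are mine, purely for
illustration; chroma uses 16-240 rather than 16-235):

#include <stdint.h>

/* Full range [0, 255] -> limited range [16, 235], 8-bit luma/RGB. */
static uint8_t full_to_limited(uint8_t v)
{
    return (uint8_t)(16 + (v * 219 + 127) / 255);
}

/* Limited range [16, 235] -> full range [0, 255]; values outside
 * 16..235 (sub-blacks / super-whites) get clipped here. */
static uint8_t limited_to_full(uint8_t v)
{
    int x = ((int)v - 16) * 255;

    if (x < 0)
        x = 0;
    x = (x + 109) / 219;                /* round to nearest */
    return x > 255 ? 255 : (uint8_t)x;
}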

> > > How would composition of multiple DRM planes work if some are limited
> > > range and some are full but you want limited range output? Your
> > > hardware needs to have CSC matrices to convert full range down to
> > > limited range, and know that you want to use them to effectively
> > > compose to limited range.
> > > In fact you can't currently tell DRM that an RGB plane is limited
> > > range - the values in enum drm_color_range are
> > > DRM_COLOR_YCBCR_LIMITED_RANGE and DRM_COLOR_YCBCR_FULL_RANGE [1].
>
> Yeah, that's because range conversion has been conflated with
> YUV-to-RGB conversion, and the result is always full-range RGB in
> practise, AFAIU. There is no way to feed limited range color into the
> further color pipeline in KMS, but that's actually a good thing.(*)
>
> The following is my opinion of the future, as someone who has been
> thinking about how to make HDR work on Wayland while allowing the
> display quality and hardware optimizations that Wayland was designed
> for:

Re: The state of Quantization Range handling

2022-11-18 Thread Pekka Paalanen
On Thu, 17 Nov 2022 22:13:26 +0100
Sebastian Wick  wrote:

> Hi Dave,
> 
> I noticed that I didn't get the Broadcast RGB property thanks to you
> (more below)
> 
> On Tue, Nov 15, 2022 at 2:16 PM Dave Stevenson
>  wrote:
> >
> > Hi Sebastian
> >
> > Thanks for starting the conversation - it's stalled a number of times
> > previously.
> >
> > On Mon, 14 Nov 2022 at 23:12, Sebastian Wick  
> > wrote:  
> > >
> > > There are still regular bug reports about monitors (sinks) and sources
> > > disagreeing about the quantization range of the pixel data. In
> > > particular sources sending full range data when the sink expects
> > > limited range. From a user space perspective, this is all hidden in
> > > the kernel. We send full range data to the kernel and then hope it
> > > does the right thing but as the bug reports show: some combinations of
> > > displays and drivers result in problems.  
> >
> > I'll agree that we as Raspberry Pi also get a number of bug reports
> > where sinks don't always look at the infoframes and misinterpret the
> > data.
> >  
> > > In general the whole handling of the quantization range on linux is
> > > not defined or documented at all. User space sends full range data
> > > because that's what seems to work most of the time but technically
> > > this is all undefined and user space can not fix those issues. Some
> > > compositors have resorted to giving users the option to choose the
> > > quantization range but this really should only be necessary for
> > > straight up broken hardware.  
> >
> > Wowsers! Making userspace worry about limited range data would be a
> > very weird decision in my view, so compositors should always deal in
> > full range data.  
> 
> Making this a user space problem is IMO the ideal way to deal with it
> but that's a bit harder to do (I'll answer that in the reply to
> Pekka). So let's just assume we all agree that user space only deals
> with full range data.

Limited range was invented for some reason, so it must have some use
somewhere, at least in the past. Maybe it was needed to calibrate mixed
digital/analog video processing chains with test images that needed to
contain sub-blacks and super-whites, to make sure that sub-blacks come
out as the nominal black etc. Just because desktop computers do not
seem to have any need for limited range, I personally wouldn't be as
arrogant as to say it's never useful. Maybe there are professional
video/broadcasting needs that currently can only be realized with
proprietary OS/hardware, because Linux just can't do it today?

Why would TVs support limited range, if it was never useful? Why would
video sources produce limited range if it was always strictly inferior
to full range?

Even digital image processing algorithms might make use of
out-of-unit-range values, not just analog circuitry for overshoot.

But no, I can't give a real example, just speculation. Hence it's fine
by me to discard limited range processing for now. Still, what I
explain below would allow limited range processing without any extra
complexity by making the KMS color pipeline better defined and less
limiting for userspace.

> > How would composition of multiple DRM planes work if some are limited
> > range and some are full but you want limited range output? Your
> > hardware needs to have CSC matrices to convert full range down to
> > limited range, and know that you want to use them to effectively
> > compose to limited range.
> > In fact you can't currently tell DRM that an RGB plane is limited
> > range - the values in enum drm_color_range are
> > DRM_COLOR_YCBCR_LIMITED_RANGE and DRM_COLOR_YCBCR_FULL_RANGE [1].

Yeah, that's because range conversion has been conflated with
YUV-to-RGB conversion, and the result is always full-range RGB in
practise, AFAIU. There is no way to feed limited range color into the
further color pipeline in KMS, but that's actually a good thing.(*)

The following is my opinion of the future, as someone who has been
thinking about how to make HDR work on Wayland while allowing the
display quality and hardware optimizations that Wayland was designed
for:


Userspace should not tell KMS about a plane being limited range at all.
The reason is the same why userspace should not tell KMS about what
colorspace a plane is in.

Instead, userspace wants to program specific mathematical operations
into KMS hardware without any associated or implied semantics. It's
just math. The actual semantics have been worked out by userspace
beforehand. This lets userspace use the KMS hardware to its fullest effect,
even for things the hardware or KMS UAPI designers did not anticipate.
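
As a concrete (hypothetical) example of "just math": a compositor that wants
limited-range output could program the post-blend CRTC GAMMA_LUT so that
full-range content is squeezed into 16-235 at scanout, without KMS ever
knowing the semantics. A rough userspace sketch using libdrm types (the LUT
size and the commit plumbing are assumptions; real hardware advertises its
own size via GAMMA_LUT_SIZE):

#include <stdint.h>
#include <xf86drmMode.h>   /* pulls in struct drm_color_lut */

#define LUT_SIZE 256       /* assumed; query GAMMA_LUT_SIZE in practice */

static void fill_full_to_limited_lut(struct drm_color_lut lut[LUT_SIZE])
{
    for (int i = 0; i < LUT_SIZE; i++) {
        double in  = (double)i / (LUT_SIZE - 1);    /* 0.0 .. 1.0 */
        double out = (16.0 + in * 219.0) / 255.0;   /* 16/255 .. 235/255 */
        uint16_t v = (uint16_t)(out * 0xffff + 0.5);

        lut[i].red = lut[i].green = lut[i].blue = v;
        lut[i].reserved = 0;
    }
    /* Upload with drmModeCreatePropertyBlob() and attach the blob id to
     * the CRTC "GAMMA_LUT" property in the atomic commit. */
}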

IMO, framebuffers and KMS planes should ultimately be in undefined
quantization range, undefined color space, and undefined dynamic range.
The correct processing of the pixel values is programmed by per-plane
KMS properties like CTM, LUT, and more specialized components like
quantization range converter or YUV-to-RGB converter (which is just
another CTM at

Re: The state of Quantization Range handling

2022-11-18 Thread Pekka Paalanen
On Thu, 17 Nov 2022 22:39:36 +0100
Sebastian Wick  wrote:

> On Wed, Nov 16, 2022 at 1:34 PM Pekka Paalanen  wrote:
> >
> > Is it a good idea to put even more automation/magic into configuring
> > the color pipeline and metadata for a sink, making them even more
> > intertwined?
> >
> > I would prefer the opposite direction, making things more explicit and
> > orthogonal.  
> 
> In general I completely agree with this, I just don't think it's
> feasible with the current state of KMS. For the color pipeline API [1]
> that's exactly the behavior I want but it should be guarded behind a
> DRM cap.
> 
> For that new API, user space needs direct control over the
> quantization range infoframe and the kernel has to somehow tell user
> space what quantization range the sink expects for the default
> behavior. User space then programs the infoframe when possible and
> builds the color pipeline in such a way that the output is whatever
> the sink expects.
> 
> The issue really is that if we push this all to user space it would be
> a backwards incompatible change. So let's fix the current situation in
> a backwards compatible way and then get it right for the new API that
> user space can opt-into.
> 
> Does that make sense?

It makes sense as long as userspace does not need to be changed to make
use of this.

But if userspace will need changes regardless, why continue on a
dead-end API? One reason could be that a new explicit API is too much
work compared to when you want your issues fixed.

If you are introducing a new KMS property (the override control), then
by definition userspace needs changes to use it.

[1] OTOH is a redesign-the-world approach, which I am not suggesting
when I talk about making this explicit. I'm thinking about a much
smaller step that concerns only quantization range handling inside the
existing color pipeline "framework". E.g. deprecate "Broadcast RGB"
property and add "quantization range conversion" property that is
orthogonal to another new property for the quantization range metadata
sent to a sink.
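
To make that concrete, a hypothetical sketch of what such a pair of
orthogonal properties could look like on the kernel side (names, values and
the connector attachment are invented here, not an existing UAPI proposal):

#include <drm/drm_property.h>

/* Hypothetical, for illustration only: one property for the conversion the
 * hardware performs, one for the range signalled to the sink (AVI
 * infoframe / GCP). */
static const struct drm_prop_enum_list quant_conversion_list[] = {
    { 0, "Identity" },        /* pass pixel values through untouched */
    { 1, "Full to Limited" }, /* scale full-range output to 16-235   */
};

static const struct drm_prop_enum_list quant_signalling_list[] = {
    { 0, "Default" },         /* sink infers the range from the mode */
    { 1, "Limited" },
    { 2, "Full" },
};

/* Each would be created with drm_property_create_enum() and attached to
 * the connector. */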


Thanks,
pq

> > > Appendix A: Broadcast RGB property
> > >
> > > A few drivers already implement the Broadcast RGB property to control
> > > the quantization range. However, it is pointless: It can be set to
> > > Auto, Full and Limited when the sink supports explicitly setting the
> > > quantization range. The driver expects full range content and converts
> > > it to limited range content when necessary. Selecting limited range
> > > never makes any sense: the out-of-range bits can't be used because the
> > > input is full range. Selecting Default never makes sense: relying on
> > > the default quantization range is risky because sinks often get it
> > > wrong and as we established there is no reason to select limited range
> > > if not necessary. The limited and full options also are not suitable
> > > as an override because the property is not available if the sink does
> > > not support explicitly setting the quantization range.
> > >  
> >  
> 
> [1] https://gitlab.freedesktop.org/pq/color-and-hdr/-/issues/11
> 





Re: The state of Quantization Range handling

2022-11-17 Thread Sebastian Wick
On Wed, Nov 16, 2022 at 1:34 PM Pekka Paalanen  wrote:
>
> On Tue, 15 Nov 2022 00:11:56 +0100
> Sebastian Wick  wrote:
>
> > There are still regular bug reports about monitors (sinks) and sources
> > disagreeing about the quantization range of the pixel data. In
> > particular sources sending full range data when the sink expects
> > limited range. From a user space perspective, this is all hidden in
> > the kernel. We send full range data to the kernel and then hope it
> > does the right thing but as the bug reports show: some combinations of
> > displays and drivers result in problems.
> >
> > In general the whole handling of the quantization range on linux is
> > not defined or documented at all. User space sends full range data
> > because that's what seems to work most of the time but technically
> > this is all undefined and user space can not fix those issues. Some
> > compositors have resorted to giving users the option to choose the
> > quantization range but this really should only be necessary for
> > straight up broken hardware.
> >
> > Quantization Range can be explicitly controlled by AVI InfoFrame or
> > HDMI General Control Packets. This is the ideal case and when the
> > source uses them there is not a lot that can go wrong. Not all
> > displays support those explicit controls in which case the chosen
> > video format (IT, CE, SD; details in CTA-861-H 5.1) influences which
> > quantization range the sink expects.
> >
> > This means that we have to expect that sometimes we have to send
> > limited and sometimes full range content. The big question however
> > that is not answered in the docs: who is responsible for making sure
> > the data is in the correct range? Is it the kernel or user space?
> >
> > If it's the kernel: does user space supply full range or limited range
> > content? Each of those has a disadvantage. If we send full range
> > content and the driver scales it down to limited range, we can't use
> > the out-of-range bits to transfer information. If we send limited
> > range content and the driver scales it up we lose information.
> >
> > Either way, this must be documented. My suggestion is to say that the
> > kernel always expects full range data as input and is responsible for
> > scaling it to limited range data if the sink expects limited range
> > data.
>
> Hi Sebastian,
>
> you are proposing that the driver/hardware will do either no range
> conversion, or full-to-limited range conversion. Limited-to-full range
> conversion would never be supported.
>
> I still wonder if limited-to-full range conversion could be useful with
> video content.
>
> > Another problem is that some displays do not behave correctly. It must
> > be possible to override the kernel when the user detects such a
> > situation. This override then controls if the driver converts the full
> > range data coming from the client or not (Default, Force Limited,
> > Force Full). It does not try to control what range the sink expects.
> > Let's call this the Quantization Range Override property which should
> > be implemented by all drivers.
>
> In other words, a CRTC "quantization range conversion" property with
> values:
> - auto, with the assumption that color pipeline always produces full-range
> - identity
> - full-to-limited
> (- limited-to-full)
>
> If this property was truly independent of the metadata being sent to
> the sink, and of the framebuffer format, it would allow us to cover all
> four combinations: a full- or limited-range framebuffer on either a
> full- or limited-range sink. It would allow us to send sub-blacks and
> super-whites as well.
>
> More precisely, framebuffers would always have *undefined* quantization
> range. The configuration of the color pipeline then determines how that
> data is manipulated into a video signal.
>
> So I am advocating the same design as with color spaces: do not tell
> KMS what your colorspaces are. Instead tell KMS what operations it
> needs to do with the pixel data, and what metadata to send to the sink.
>
> > All drivers should make sure their behavior is correct:
> >
> > * check that drivers choose the correct default quantization range for
> > the selected mode
>
> Mode implying a quantization range is awkward, but maybe the kernel
> established modes should just have a flag for it. Then userspace would
> know. Unless the video mode system is extended to communicate
> IT/CE/SD/VIC and whatnot to userspace, making the modes better defined.
> Then userspace would know too.
>
> > * whenever explicit control is available, use it and set the
> > quantization range to full
> > * make sure that the hardware converts from full range to limited
> > range whenever the sink expects limited range
> > * implement the Quantization Range Override property
> >
> > I'm volunteering for the documentation, UAPI and maybe even the drm
> > core parts if there is willingness to tackle the issue.
>
> Is it a good idea to put even more automation/magic into configuring
> the color pipeline and metadata for a sink, making them even more
> intertwined?

Re: The state of Quantization Range handling

2022-11-17 Thread Sebastian Wick
Hi Yussuf,

On Tue, Nov 15, 2022 at 5:26 PM Yussuf Khalil  wrote:
>
> Hello Sebastian,
>
> I've previously done some work on this topic [1]. My efforts were mostly about
> fixing the situation regarding overrides and providing proper means for
> userspace. I am affected by the issue myself as I own several DELL U2414H
> screens that declare a CE mode as their preferred one, but should receive full
> range data nonetheless. They do not have the respective bit set in their EDID
> to indicate full range support either.
>
> My implementation primarily moved the "Broadcast RGB" to DRM core and re-wired
> it in i915 and gma500. I further added a flag to indicate CE modes to 
> userspace
> so that apps such as gnome-control-center can clearly communicate whether full
> or limited range would be used by default. A v2 branch that I never submitted
> is available at [2]. I also have some code lying around locally that adds the
> required functionality to mutter and gnome-control-center.

Yeah, I now agree that moving the "Broadcast RGB" to DRM core would be
a good decision. The slight behavior change I want to see can be done
afterwards as well. Not so sure about indicating CE modes because
there are other factors (YCC vs RGB, the connector type and version)
which influence the default quantization range.

>
> I had to pause work on the issue back then and never really came around to
> picking it up again, however, I would be interested in working on it again if
> there is consensus on the direction that my patches laid out. I did not
> consider use cases for the out-of-range bits though.

I think we can safely ignore out-of-range bits for now, and it's good to
know you're on board.

>
> Regards
> Yussuf
>
> [1] 
> https://patchwork.kernel.org/project/dri-devel/cover/20200413214024.46500-1-...@pp3345.net/
> [2] https://github.com/pp3345/linux/commits/rgb-quant-range-v2
>
> On 15.11.22 00:11, Sebastian Wick wrote:
> > There are still regular bug reports about monitors (sinks) and sources
> > disagreeing about the quantization range of the pixel data. In
> > particular sources sending full range data when the sink expects
> > limited range. From a user space perspective, this is all hidden in
> > the kernel. We send full range data to the kernel and then hope it
> > does the right thing but as the bug reports show: some combinations of
> > displays and drivers result in problems.
> >
> > In general the whole handling of the quantization range on linux is
> > not defined or documented at all. User space sends full range data
> > because that's what seems to work most of the time but technically
> > this is all undefined and user space can not fix those issues. Some
> > compositors have resorted to giving users the option to choose the
> > quantization range but this really should only be necessary for
> > straight up broken hardware.
> >
> > Quantization Range can be explicitly controlled by AVI InfoFrame or
> > HDMI General Control Packets. This is the ideal case and when the
> > source uses them there is not a lot that can go wrong. Not all
> > displays support those explicit controls in which case the chosen
> > video format (IT, CE, SD; details in CTA-861-H 5.1) influences which
> > quantization range the sink expects.
> >
> > This means that we have to expect that sometimes we have to send
> > limited and sometimes full range content. The big question however
> > that is not answered in the docs: who is responsible for making sure
> > the data is in the correct range? Is it the kernel or user space?
> >
> > If it's the kernel: does user space supply full range or limited range
> > content? Each of those has a disadvantage. If we send full range
> > content and the driver scales it down to limited range, we can't use
> > the out-of-range bits to transfer information. If we send limited
> > range content and the driver scales it up we lose information.
> >
> > Either way, this must be documented. My suggestion is to say that the
> > kernel always expects full range data as input and is responsible for
> > scaling it to limited range data if the sink expects limited range
> > data.
> >
> > Another problem is that some displays do not behave correctly. It must
> > be possible to override the kernel when the user detects such a
> > situation. This override then controls if the driver converts the full
> > range data coming from the client or not (Default, Force Limited,
> > Force Full). It does not try to control what range the sink expects.
> > Let's call this the Quantization Range Override property which should
> > be implemented by all drivers.
> >
> > All drivers should make sure their behavior is correct:
> >
> > * check that drivers choose the correct default quantization range for
> > the selected mode
> > * whenever explicit control is available, use it and set the
> > quantization range to full
> > * make sure that the hardware converts from full range to limited
> > range whenever the sink expects limited range
> 

Re: The state of Quantization Range handling

2022-11-17 Thread Sebastian Wick
Hi Dave,

I noticed that I didn't get the Broadcast RGB property thanks to you
(more below)

On Tue, Nov 15, 2022 at 2:16 PM Dave Stevenson
 wrote:
>
> Hi Sebastian
>
> Thanks for starting the conversation - it's stalled a number of times
> previously.
>
> On Mon, 14 Nov 2022 at 23:12, Sebastian Wick  
> wrote:
> >
> > There are still regular bug reports about monitors (sinks) and sources
> > disagreeing about the quantization range of the pixel data. In
> > particular sources sending full range data when the sink expects
> > limited range. From a user space perspective, this is all hidden in
> > the kernel. We send full range data to the kernel and then hope it
> > does the right thing but as the bug reports show: some combinations of
> > displays and drivers result in problems.
>
> I'll agree that we as Raspberry Pi also get a number of bug reports
> where sinks don't always look at the infoframes and misinterpret the
> data.
>
> > In general the whole handling of the quantization range on linux is
> > not defined or documented at all. User space sends full range data
> > because that's what seems to work most of the time but technically
> > this is all undefined and user space can not fix those issues. Some
> > compositors have resorted to giving users the option to choose the
> > quantization range but this really should only be necessary for
> > straight up broken hardware.
>
> Wowsers! Making userspace worry about limited range data would be a
> very weird decision in my view, so compositors should always deal in
> full range data.

Making this a user space problem is IMO the ideal way to deal with it
but that's a bit harder to do (I'll answer that in the reply to
Pekka). So let's just assume we all agree that user space only deals
with full range data.

> How would composition of multiple DRM planes work if some are limited
> range and some are full but you want limited range output? Your
> hardware needs to have CSC matrices to convert full range down to
> limited range, and know that you want to use them to effectively
> compose to limited range.
> In fact you can't currently tell DRM that an RGB plane is limited
> range - the values in enum drm_color_range are
> DRM_COLOR_YCBCR_LIMITED_RANGE and DRM_COLOR_YCBCR_FULL_RANGE [1].
>
> > Quantization Range can be explicitly controlled by AVI InfoFrame or
> > HDMI General Control Packets. This is the ideal case and when the
> > source uses them there is not a lot that can go wrong. Not all
> > displays support those explicit controls in which case the chosen
> > video format (IT, CE, SD; details in CTA-861-H 5.1) influences which
> > quantization range the sink expects.
> >
> > This means that we have to expect that sometimes we have to send
> > limited and sometimes full range content. The big question however
> > that is not answered in the docs: who is responsible for making sure
> > the data is in the correct range? Is it the kernel or user space?
> >
> > If it's the kernel: does user space supply full range or limited range
> > content? Each of those has a disadvantage. If we send full range
> > content and the driver scales it down to limited range, we can't use
> > the out-of-range bits to transfer information. If we send limited
> > range content and the driver scales it up we lose information.
>
> How often have you encountered the out-of-range bits being used?
> Personally I've never come across it. Is it really that common?
> If trying to pass non-video data from the client then you need to pray
> there is no scaling or filtering during composition as it could
> legitimately be corrupted.

All true, and personally I've also never encountered this which is why
I'd like to ignore all of that for now.

>
> > Either way, this must be documented. My suggestion is to say that the
> > kernel always expects full range data as input and is responsible for
> > scaling it to limited range data if the sink expects limited range
> > data.
>
> AIUI that is the current situation. It certainly fits the way that all
> our hardware works.
>
> > Another problem is that some displays do not behave correctly. It must
> > be possible to override the kernel when the user detects such a
> > situation. This override then controls if the driver converts the full
> > range data coming from the client or not (Default, Force Limited,
> > Force Full). It does not try to control what range the sink expects.
>
> Sorry, I'm not clear from the description. Is this a plane, crtc, or
> connector property?
>
> "Data coming from the client" would imply a plane property only -
> effectively extending enum drm_color_range for RGB formats.
>
> If it is a connector property then what do you mean by not controlling
> the range? It doesn't change the AVI Infoframe or GCP and leaves the
> sink thinking it is the default? If so, doesn't that mean this control
> can now make a compliant sink incorrectly render the data? Assuming
> the driver is using drm_hdmi_avi_infoframe_quant_range [2], then the
> sink is likely to be told explicitly that it is one value which is
> then actually wrong.

Re: The state of Quantization Range handling

2022-11-16 Thread Pekka Paalanen
On Tue, 15 Nov 2022 00:11:56 +0100
Sebastian Wick  wrote:

> There are still regular bug reports about monitors (sinks) and sources
> disagreeing about the quantization range of the pixel data. In
> particular sources sending full range data when the sink expects
> limited range. From a user space perspective, this is all hidden in
> the kernel. We send full range data to the kernel and then hope it
> does the right thing but as the bug reports show: some combinations of
> displays and drivers result in problems.
> 
> In general the whole handling of the quantization range on linux is
> not defined or documented at all. User space sends full range data
> because that's what seems to work most of the time but technically
> this is all undefined and user space can not fix those issues. Some
> compositors have resorted to giving users the option to choose the
> quantization range but this really should only be necessary for
> straight up broken hardware.
> 
> Quantization Range can be explicitly controlled by AVI InfoFrame or
> HDMI General Control Packets. This is the ideal case and when the
> source uses them there is not a lot that can go wrong. Not all
> displays support those explicit controls in which case the chosen
> video format (IT, CE, SD; details in CTA-861-H 5.1) influences which
> quantization range the sink expects.
> 
> This means that we have to expect that sometimes we have to send
> limited and sometimes full range content. The big question however
> that is not answered in the docs: who is responsible for making sure
> the data is in the correct range? Is it the kernel or user space?
> 
> If it's the kernel: does user space supply full range or limited range
> content? Each of those has a disadvantage. If we send full range
> content and the driver scales it down to limited range, we can't use
> the out-of-range bits to transfer information. If we send limited
> range content and the driver scales it up we lose information.
> 
> Either way, this must be documented. My suggestion is to say that the
> kernel always expects full range data as input and is responsible for
> scaling it to limited range data if the sink expects limited range
> data.

Hi Sebastian,

you are proposing that the driver/hardware will do either no range
conversion, or full-to-limited range conversion. Limited-to-full range
conversion would never be supported.

I still wonder if limited-to-full range conversion could be useful with
video content.

> Another problem is that some displays do not behave correctly. It must
> be possible to override the kernel when the user detects such a
> situation. This override then controls if the driver converts the full
> range data coming from the client or not (Default, Force Limited,
> Force Full). It does not try to control what range the sink expects.
> Let's call this the Quantization Range Override property which should
> be implemented by all drivers.

In other words, a CRTC "quantization range conversion" property with
values:
- auto, with the assumption that color pipeline always produces full-range
- identity
- full-to-limited
(- limited-to-full)
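
A minimal pseudo-driver sketch (names invented) of how the "auto" value
above could resolve, under the stated assumption that the color pipeline
always hands the CRTC full-range pixels:

#include <stdbool.h>

enum quant_conversion {
    QUANT_CONV_AUTO,
    QUANT_CONV_IDENTITY,
    QUANT_CONV_FULL_TO_LIMITED,
};

static enum quant_conversion
resolve_quant_conversion(enum quant_conversion requested,
                         bool sink_expects_limited)
{
    if (requested != QUANT_CONV_AUTO)
        return requested;       /* explicit userspace choice wins */
    return sink_expects_limited ? QUANT_CONV_FULL_TO_LIMITED
                                : QUANT_CONV_IDENTITY;
}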

If this property was truly independent of the metadata being sent to
the sink, and of the framebuffer format, it would allow us to cover all
four combinations: a full- or limited-range framebuffer on either a
full- or limited-range sink. It would allow us to send sub-blacks and
super-whites as well.

More precisely, framebuffers would always have *undefined* quantization
range. The configuration of the color pipeline then determines how that
data is manipulated into a video signal.

So I am advocating the same design as with color spaces: do not tell
KMS what your colorspaces are. Instead tell KMS what operations it
needs to do with the pixel data, and what metadata to send to the sink.

> All drivers should make sure their behavior is correct:
> 
> * check that drivers choose the correct default quantization range for
> the selected mode

Mode implying a quantization range is awkward, but maybe the kernel
established modes should just have a flag for it. Then userspace would
know. Unless the video mode system is extended to communicate
IT/CE/SD/VIC and whatnot to userspace, making the modes better defined.
Then userspace would know too.
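
For reference, the kernel's current heuristic for the default range lives in
drm_default_rgb_quant_range() in drm_edid.c; paraphrased (not verbatim
kernel code), it boils down to:

#include <linux/hdmi.h>
#include <drm/drm_modes.h>
#include <drm/drm_edid.h>

/* Every CEA/CTA mode except VIC 1 (640x480) defaults to limited range;
 * everything else (VESA/DMT, i.e. IT modes) defaults to full range. */
static enum hdmi_quantization_range
default_rgb_quant_range(const struct drm_display_mode *mode)
{
    return drm_match_cea_mode(mode) > 1 ?
        HDMI_QUANTIZATION_RANGE_LIMITED :
        HDMI_QUANTIZATION_RANGE_FULL;
}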

> * whenever explicit control is available, use it and set the
> quantization range to full
> * make sure that the hardware converts from full range to limited
> range whenever the sink expects limited range
> * implement the Quantization Range Override property
> 
> I'm volunteering for the documentation, UAPI and maybe even the drm
> core parts if there is willingness to tackle the issue.

Is it a good idea to put even more automation/magic into configuring
the color pipeline and metadata for a sink, making them even more
intertwined?

I would prefer the opposite direction, making things more explicit and
orthogonal.


Thanks,
pq

> Appendix A: Broadcast RGB property
> 
> A few drivers already implement the Broadcast RGB property to control
> the quantization range.

Re: The state of Quantization Range handling

2022-11-15 Thread Yussuf Khalil
Hello Sebastian,

I've previously done some work on this topic [1]. My efforts were mostly about 
fixing the situation regarding overrides and providing proper means for 
userspace. I am affected by the issue myself as I own several DELL U2414H 
screens that declare a CE mode as their preferred one, but should receive full 
range data nonetheless. They do not have the respective bit set in their EDID 
to indicate full range support either.

My implementation primarily moved the "Broadcast RGB" to DRM core and re-wired 
it in i915 and gma500. I further added a flag to indicate CE modes to userspace 
so that apps such as gnome-control-center can clearly communicate whether full 
or limited range would be used by default. A v2 branch that I never submitted 
is available at [2]. I also have some code lying around locally that adds the 
required functionality to mutter and gnome-control-center.

I had to pause work on the issue back then and never really came around to 
picking it up again, however, I would be interested in working on it again if 
there is consensus on the direction that my patches laid out. I did not 
consider use cases for the out-of-range bits though.

Regards
Yussuf

[1] 
https://patchwork.kernel.org/project/dri-devel/cover/20200413214024.46500-1-...@pp3345.net/
[2] https://github.com/pp3345/linux/commits/rgb-quant-range-v2

On 15.11.22 00:11, Sebastian Wick wrote:
> There are still regular bug reports about monitors (sinks) and sources
> disagreeing about the quantization range of the pixel data. In
> particular sources sending full range data when the sink expects
> limited range. From a user space perspective, this is all hidden in
> the kernel. We send full range data to the kernel and then hope it
> does the right thing but as the bug reports show: some combinations of
> displays and drivers result in problems.
> 
> In general the whole handling of the quantization range on linux is
> not defined or documented at all. User space sends full range data
> because that's what seems to work most of the time but technically
> this is all undefined and user space can not fix those issues. Some
> compositors have resorted to giving users the option to choose the
> quantization range but this really should only be necessary for
> straight up broken hardware.
> 
> Quantization Range can be explicitly controlled by AVI InfoFrame or
> HDMI General Control Packets. This is the ideal case and when the
> source uses them there is not a lot that can go wrong. Not all
> displays support those explicit controls in which case the chosen
> video format (IT, CE, SD; details in CTA-861-H 5.1) influences which
> quantization range the sink expects.
> 
> This means that we have to expect that sometimes we have to send
> limited and sometimes full range content. The big question however
> that is not answered in the docs: who is responsible for making sure
> the data is in the correct range? Is it the kernel or user space?
> 
> If it's the kernel: does user space supply full range or limited range
> content? Each of those has a disadvantage. If we send full range
> content and the driver scales it down to limited range, we can't use
> the out-of-range bits to transfer information. If we send limited
> range content and the driver scales it up we lose information.
> 
> Either way, this must be documented. My suggestion is to say that the
> kernel always expects full range data as input and is responsible for
> scaling it to limited range data if the sink expects limited range
> data.
> 
> Another problem is that some displays do not behave correctly. It must
> be possible to override the kernel when the user detects such a
> situation. This override then controls if the driver converts the full
> range data coming from the client or not (Default, Force Limited,
> Force Full). It does not try to control what range the sink expects.
> Let's call this the Quantization Range Override property which should
> be implemented by all drivers.
> 
> All drivers should make sure their behavior is correct:
> 
> * check that drivers choose the correct default quantization range for
> the selected mode
> * whenever explicit control is available, use it and set the
> quantization range to full
> * make sure that the hardware converts from full range to limited
> range whenever the sink expects limited range
> * implement the Quantization Range Override property
> 
> I'm volunteering for the documentation, UAPI and maybe even the drm
> core parts if there is willingness to tackle the issue.
> 
> Appendix A: Broadcast RGB property
> 
> A few drivers already implement the Broadcast RGB property to control
> the quantization range. However, it is pointless: It can be set to
> Auto, Full and Limited when the sink supports explicitly setting the
> quantization range. The driver expects full range content and converts
> it to limited range content when necessary. Selecting limited range
> never makes any sense: the out-of-range bits can't be used because the
> input is full range.

Re: The state of Quantization Range handling

2022-11-15 Thread Dave Stevenson
Hi Sebastian

Thanks for starting the conversation - it's stalled a number of times
previously.

On Mon, 14 Nov 2022 at 23:12, Sebastian Wick  wrote:
>
> There are still regular bug reports about monitors (sinks) and sources
> disagreeing about the quantization range of the pixel data. In
> particular sources sending full range data when the sink expects
> limited range. From a user space perspective, this is all hidden in
> the kernel. We send full range data to the kernel and then hope it
> does the right thing but as the bug reports show: some combinations of
> displays and drivers result in problems.

I'll agree that we as Raspberry Pi also get a number of bug reports
where sinks don't always look at the infoframes and misinterpret the
data.

> In general the whole handling of the quantization range on linux is
> not defined or documented at all. User space sends full range data
> because that's what seems to work most of the time but technically
> this is all undefined and user space can not fix those issues. Some
> compositors have resorted to giving users the option to choose the
> quantization range but this really should only be necessary for
> straight up broken hardware.

Wowsers! Making userspace worry about limited range data would be a
very weird decision in my view, so compositors should always deal in
full range data.

How would composition of multiple DRM planes work if some are limited
range and some are full but you want limited range output? Your
hardware needs to have CSC matrices to convert full range down to
limited range, and know that you want to use them to effectively
compose to limited range.
In fact you can't currently tell DRM that an RGB plane is limited
range - the values in enum drm_color_range are
DRM_COLOR_YCBCR_LIMITED_RANGE and DRM_COLOR_YCBCR_FULL_RANGE [1].
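
For reference, the enum in question (from include/drm/drm_color_mgmt.h;
drivers expose it per plane via drm_plane_create_color_properties()):

/* Note that both values are YCbCr-specific: there is currently no way
 * to mark an RGB plane as limited range. */
enum drm_color_range {
    DRM_COLOR_YCBCR_LIMITED_RANGE,
    DRM_COLOR_YCBCR_FULL_RANGE,
    DRM_COLOR_RANGE_MAX,
};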

> Quantization Range can be explicitly controlled by AVI InfoFrame or
> HDMI General Control Packets. This is the ideal case and when the
> source uses them there is not a lot that can go wrong. Not all
> displays support those explicit controls in which case the chosen
> video format (IT, CE, SD; details in CTA-861-H 5.1) influences which
> quantization range the sink expects.
>
> This means that we have to expect that sometimes we have to send
> limited and sometimes full range content. The big question however
> that is not answered in the docs: who is responsible for making sure
> the data is in the correct range? Is it the kernel or user space?
>
> If it's the kernel: does user space supply full range or limited range
> content? Each of those has a disadvantage. If we send full range
> content and the driver scales it down to limited range, we can't use
> the out-of-range bits to transfer information. If we send limited
> range content and the driver scales it up we lose information.

How often have you encountered the out-of-range bits being used?
Personally I've never come across it. Is it really that common?
If trying to pass non-video data from the client then you need to pray
there is no scaling or filtering during composition as it could
legitimately be corrupted.

> Either way, this must be documented. My suggestion is to say that the
> kernel always expects full range data as input and is responsible for
> scaling it to limited range data if the sink expects limited range
> data.

AIUI that is the current situation. It certainly fits the way that all
our hardware works.

> Another problem is that some displays do not behave correctly. It must
> be possible to override the kernel when the user detects such a
> situation. This override then controls if the driver converts the full
> range data coming from the client or not (Default, Force Limited,
> Force Full). It does not try to control what range the sink expects.

Sorry, I'm not clear from the description. Is this a plane, crtc, or
connector property?

"Data coming from the client" would imply a plane property only -
effectively extending enum drm_color_range for RGB formats.

If it is a connector property then what do you mean by not controlling
the range? It doesn't change the AVI Infoframe or GCP and leaves the
sink thinking it is the default? If so, doesn't that mean this control
can now make a compliant sink incorrectly render the data? Assuming
the driver is using drm_hdmi_avi_infoframe_quant_range [2], then the
sink is likely to be told explicitly that it is one value which is
then actually wrong.
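
For context, the signalling in question is a single field in the AVI
infoframe. A simplified sketch of what the source side effectively does
(the helper below is illustrative, not the actual
drm_hdmi_avi_infoframe_quant_range() code):

#include <linux/types.h>
#include <linux/hdmi.h>

static void set_avi_rgb_range(struct hdmi_avi_infoframe *frame,
                              bool sink_selectable_rgb_range,
                              bool output_is_full_range)
{
    if (sink_selectable_rgb_range)
        /* Sink honours the explicit Q bits: say what we actually send. */
        frame->quantization_range = output_is_full_range ?
            HDMI_QUANTIZATION_RANGE_FULL :
            HDMI_QUANTIZATION_RANGE_LIMITED;
    else
        /* Otherwise signal "default" and the sink infers the range
         * from the video format (CTA-861 5.1). */
        frame->quantization_range = HDMI_QUANTIZATION_RANGE_DEFAULT;
}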


Or is this a flag to tell the crtc to compose all planes to a limited
range output, thereby updating all the CSC matrices used for RGB and
YCbCr planes? There's still no guarantee that the composition won't
clip the video to the specified output range thereby losing the out of
range values you carefully crafted.

> Let's call this the Quantization Range Override property which should
> be implemented by all drivers.
>
> All drivers should make sure their behavior is correct:
>
> * check that drivers choose the correct default quantization range for
> the selected mode