On Sun, 14 Apr 2019 12:57:47 +0200 Erwin Burema <e.bur...@gmail.com> wrote:
> Without a way to calibrate/profile screens, a color management
> protocol loses a lot of its value. So to add this missing feature I
> wrote the following protocol.
>
> The idea is that the calibration/profiling SW only sets the RGB
> triplet, and the compositor is then responsible for drawing a
> rectangular region on the selected output screen. Since not all
> calibration tools will be at the center of the screen, a user should
> be able to modify the placement of this rectangular region. Unless
> specified, the monitor profile (if any) should not be applied, but
> the GPU curve should. Currently, to set a new curve, the calibration
> tool should generate a new ICC profile with the wanted curve in the
> VCGT tag (I am not sure if this is the best option, but it would be
> the most universal one). In the end, after profiling, the last
> uploaded ICC profile could then be saved (although a compositor is
> not required to honor the request; in that case it should send the
> "not saved" error). If the compositor doesn't save, or the connection
> with this protocol is broken, the compositor should restore the
> previous settings.

Hi,

I only took a very quick glance, but I do like where this design is
going. I'll refrain from commenting on wl_surface vs. not for now
though.

Forgive my ignorance, but why does the "GPU curve" need to be a custom
curve provided by the client?

My naive thinking would assume that you would like to be able to
address the pixel values on the display wire as directly as possible,
which means a minimum of 12 or 14 bits per channel framebuffer format
and an identity "GPU curve".

Is the reason to use the "GPU curve" that you assume there is an
8 bits per channel framebuffer, and you need to use the hardware LUT
to choose which 8-bit-wide range of the possibly 14-bit channel you
want to address? (Currently a client cannot know if the framebuffer is
8 bits, or less, or more.) Your protocol proposal uses the pixel
encoding red/green/blue as uint (32-bit) per channel.
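To make that question concrete, here is a rough sketch (my own
illustration, not anything from the proposal; the function name and
parameters are made up) of how a per-channel hardware LUT could be
programmed so that the 256 codes of an 8-bit framebuffer land on a
chosen 256-entry-wide window of a 14-bit wire range:

```python
def window_lut(base, wire_bits=14, fb_bits=8):
    """Build a per-channel LUT mapping each framebuffer code to one
    wire code, starting at 'base' and clamped to the wire range."""
    size = 1 << fb_bits                 # 256 LUT entries for 8-bit FB
    max_wire = (1 << wire_bits) - 1     # 16383 for a 14-bit wire
    return [min(base + i, max_wire) for i in range(size)]

# Address wire codes 4096..4351 directly from an 8-bit framebuffer:
lut = window_lut(base=4096)
```

By reprogramming `base` between patch measurements, a calibration tool
could sweep the full 14-bit wire range despite the 8-bit framebuffer,
which would explain wanting client control over the curve.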
Would it be possible to have the compositor do the LUT manipulation,
if it needs to avoid the intermediate rounding caused by an 8-bit per
channel framebuffer or color pipeline up to the final LUT?

If such "GPU curve" manipulation is necessary, it essentially means
nothing else can be shown on the output.

Oh, could another reason to have the client control the "GPU curve" be
that the client can then still show information on that output, since
it can adjust the pixel contents to remain legible even while applying
the manipulation? Is that used or desirable?

Btw. how would a compositor know the bit depth of a monitor and the
transport (wire)? I presume there should be some KMS properties for
that in addition to connector types.

Thanks,
pq
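A quick numeric sketch of the intermediate-rounding concern (my own
illustration; the `quantize` helper and the round-to-nearest scaling
are assumptions, not anything the proposal specifies): take a 32-bit
protocol value and compare sending it straight to a 14-bit wire versus
squeezing it through an 8-bit framebuffer first.

```python
U32_MAX = (1 << 32) - 1

def quantize(value_u32, bits):
    """Scale a 32-bit channel value to a 'bits'-deep code,
    rounding to nearest."""
    return (value_u32 * ((1 << bits) - 1) + U32_MAX // 2) // U32_MAX

v = 0x80010000                 # a 32-bit channel value, just above mid

direct = quantize(v, 14)       # 32-bit -> 14-bit wire directly
fb8 = quantize(v, 8)           # 32-bit -> 8-bit framebuffer
via_8bit = quantize(fb8 * U32_MAX // 255, 14)   # 8-bit FB -> 14-bit wire
```

Here `direct` and `via_8bit` differ by a few dozen wire codes: the
8-bit intermediate collapses nearby 32-bit values onto the same code,
which is exactly the precision a final-LUT manipulation would try to
claw back.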
_______________________________________________
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/wayland-devel