Re: [Intel-gfx] PROBLEM: Native backlight regressed from logarithmic to linear scale
On Tue, 29 Jul 2014, Daniel Vetter wrote:
> On Tue, Jul 29, 2014 at 06:14:16AM -0400, Anders Kaseorg wrote:
>> On Tue, 29 Jul 2014, Hans de Goede wrote:
>> > I've been thinking a bit about this, and I believe that the right
>> > answer here is to do the linear to logarithmic mapping in user-space.
>> > The intel backlight interface has a type of raw, clearly signalling
>> > to userspace that it is a raw "untranslated" interface; as such, any
>> > fanciness such as creating a logarithmic scale should be done in
>> > userspace IMHO.
>>
>> I was going to respond that the kernel does its own brightness stepping
>> when userspace isn’t paying attention. But apparently only acpi_video
>> does that, and intel_backlight does not; my brightness keys now have no
>> effect outside of the X server. Is that the expected behavior?
>
> Userspace on linux is supposed to catch brightness keys and update the
> backlight. Some acpi drivers do funny stuff behind everyone's back, but
> generally that's the expected behavior. You'd need a daemon for the
> backlight to work on the console.
>
>> In any case, if you think punting part of the problem to userspace is
>> the right answer, then to flesh out the details: do you think it’s
>> right for userspace to assume that any backlight with type ‘raw’ is a
>> linear scale that needs to be converted, and one with type ‘firmware’
>> or ‘platform’ has already been converted appropriately?
>
> I don't think you can generally assume anything - what we do is send the
> pwm signal, and how linearly that translates into brightness is totally
> panel and driver dependent. So no matter what you pick, someone will
> complain, I think.

Because the mapping from PWM duty cycle to luminance is panel dependent,
the ACPI opregion contains such a mapping; the ACPI backlight likely uses
just that. We (i915) currently don't. I don't think userspace has a
sensible interface to that information, and I'm not sure it should either.
I haven't made up my mind on this, but I might go for doing the mapping in
i915. Additionally, I think we should probably use a fixed range of, say,
0-100 that gets exposed to userspace; there's no point in exposing e.g.
2^10 levels when the hardware cannot physically produce, nor the user
distinguish, that many distinct levels.

I'd go for making this as simple as possible to use and implement right.
Anything fancy is going to blow up in fantastic ways.

BR,
Jani.

--
Jani Nikula, Intel Open Source Technology Center
___
Intel-gfx mailing list
Intel-gfx@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/intel-gfx
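[Editorial note: the fixed 0-100 user-facing scale Jani proposes above, mapped perceptually onto a raw PWM range, can be sketched roughly as follows. This is a hypothetical illustration only, not i915 code; `user_to_pwm` is an invented name, and the default `pwm_min`/`pwm_max` values are taken from the T510 intel_backlight measurements quoted elsewhere in this thread.]

```python
# Hypothetical sketch: map a fixed user-facing 0..100 brightness scale to
# a raw PWM duty-cycle range [pwm_min, pwm_max] using an exponential
# curve, so equal user steps give roughly equal perceived brightness
# steps (Fechner's law). Example min/max are from the T510 measurements.

def user_to_pwm(level, pwm_min=52, pwm_max=4437):
    """Map level in 0..100 to a raw PWM value.

    Level 0 maps to pwm_min (dimmest visible step) rather than to
    backlight-off, avoiding the "lowest setting turns the screen off"
    problem described in the original report.
    """
    if not 0 <= level <= 100:
        raise ValueError("level must be in 0..100")
    ratio = pwm_max / pwm_min
    return round(pwm_min * ratio ** (level / 100))
```

With these example endpoints, each user step multiplies the PWM value by the same factor (about 1.045), instead of adding a fixed increment as a linear scale would.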
Re: [Intel-gfx] PROBLEM: Native backlight regressed from logarithmic to linear scale
On Tue, Jul 29, 2014 at 06:14:16AM -0400, Anders Kaseorg wrote:
> On Tue, 29 Jul 2014, Hans de Goede wrote:
> > I've been thinking a bit about this, and I believe that the right
> > answer here is to do the linear to logarithmic mapping in user-space.
> > The intel backlight interface has a type of raw, clearly signalling to
> > userspace that it is a raw "untranslated" interface; as such, any
> > fanciness such as creating a logarithmic scale should be done in
> > userspace IMHO.
>
> I was going to respond that the kernel does its own brightness stepping
> when userspace isn’t paying attention. But apparently only acpi_video
> does that, and intel_backlight does not; my brightness keys now have no
> effect outside of the X server. Is that the expected behavior?

Userspace on linux is supposed to catch brightness keys and update the
backlight. Some acpi drivers do funny stuff behind everyone's back, but
generally that's the expected behavior. You'd need a daemon for the
backlight to work on the console.

> In any case, if you think punting part of the problem to userspace is
> the right answer, then to flesh out the details: do you think it’s right
> for userspace to assume that any backlight with type ‘raw’ is a linear
> scale that needs to be converted, and one with type ‘firmware’ or
> ‘platform’ has already been converted appropriately?

I don't think you can generally assume anything - what we do is send the
pwm signal, and how linearly that translates into brightness is totally
panel and driver dependent. So no matter what you pick, someone will
complain, I think.

-Daniel
--
Daniel Vetter
Software Engineer, Intel Corporation
+41 (0) 79 365 57 48 - http://blog.ffwll.ch
Re: [Intel-gfx] PROBLEM: Native backlight regressed from logarithmic to linear scale
On Tue, 29 Jul 2014, Hans de Goede wrote:
> I've been thinking a bit about this, and I believe that the right answer
> here is to do the linear to logarithmic mapping in user-space. The intel
> backlight interface has a type of raw, clearly signalling to userspace
> that it is a raw "untranslated" interface; as such, any fanciness such
> as creating a logarithmic scale should be done in userspace IMHO.

I was going to respond that the kernel does its own brightness stepping
when userspace isn’t paying attention. But apparently only acpi_video does
that, and intel_backlight does not; my brightness keys now have no effect
outside of the X server. Is that the expected behavior?

In any case, if you think punting part of the problem to userspace is the
right answer, then to flesh out the details: do you think it’s right for
userspace to assume that any backlight with type ‘raw’ is a linear scale
that needs to be converted, and one with type ‘firmware’ or ‘platform’ has
already been converted appropriately?

Anders
Re: [Intel-gfx] PROBLEM: Native backlight regressed from logarithmic to linear scale
Hi,

On 07/22/2014 06:32 AM, Anders Kaseorg wrote:
> [1.] One line summary of the problem:
>
> Native backlight regressed from logarithmic to linear scale
>
> [2.] Full description of the problem/report:
>
> With the new default of video.use_native_backlight=0 (commit
> v3.16-rc1~30^2~2^3), my Thinkpad T510 backlight has become very
> difficult to control near the low end of the scale. The lowest setting
> now turns the backlight completely off, but the second-lowest setting is
> too bright. Meanwhile, the difference between the highest several
> settings is nearly imperceptible.
>
> This happened because /sys/class/backlight/acpi_video0/brightness (which
> has now disappeared) used a different scale than
> /sys/class/backlight/intel_backlight/brightness; the relationship
> between acpi_video0 and intel_backlight was not linear. I measured the
> exact relationship as follows:
>
>   acpi  intel
>      0     52
>      1     87
>      2    139
>      3    174
>      4    226
>      5    261
>      6    313
>      7    435
>      8    591
>      9    800
>     10   1078
>     11   1461
>     12   1914
>     13   2557
>     14   3358
>     15   4437
>
> The relationship is close to logarithmic; a good fit is
> intel = 60*(4/3)^acpi, or acpi = log_{4/3}(intel/60). It’s well known
> that perceived brightness varies logarithmically with physical luminance
> (Fechner’s law), so it’s no surprise that the acpi_video0 scale was more
> useful.
>
> Since the kernel no longer uses ACPI to do this translation, it should
> do the translation itself.

I've been thinking a bit about this, and I believe that the right answer
here is to do the linear to logarithmic mapping in user-space. The intel
backlight interface has a type of raw, clearly signalling to userspace
that it is a raw "untranslated" interface; as such, any fanciness such as
creating a logarithmic scale should be done in userspace IMHO.

Regards,

Hans
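[Editorial note: the logarithmic fit quoted in the report above can be checked numerically. A minimal sketch in plain Python, using the measured acpi_video0 → intel_backlight table from the report; the names `measured`, `fit`, and `errors` are illustrative only.]

```python
# Measured intel_backlight values for acpi_video0 levels 0..15, taken
# from the table in Anders's report above.
measured = [52, 87, 139, 174, 226, 261, 313, 435, 591, 800,
            1078, 1461, 1914, 2557, 3358, 4437]

# The proposed fit from the report: intel = 60 * (4/3)**acpi
def fit(acpi):
    return 60 * (4 / 3) ** acpi

# Relative error of the fit at each acpi level.
errors = [abs(fit(a) - m) / m for a, m in enumerate(measured)]
```

Running this, the fit stays within about 8% of the measured values from acpi = 5 upward and within roughly 25% at its worst point (acpi = 2), so the "close to logarithmic" characterization holds up.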