On 4/10/2018 1:49 AM, Nikula, Jani wrote:
On Tue, 10 Apr 2018, Jani Nikula <jani.nik...@intel.com> wrote:
On Mon, 09 Apr 2018, "Kumar, Abhay" <abhay.ku...@intel.com> wrote:
Dynamic cdclk is disabled in the BIOS/GOP, hence the GOP sets it to the highest
clock, i.e. 316.8. I will attach logs with drm debug enabled to the bug.
I am also inclined towards 192, which will give the max cdclk.
Please also attach /sys/kernel/debug/dri/0/i915_vbt to the bug.

It doesn't look like we check the VBT dynamic cdclk field. Does
dynamic cdclk disabled mean we should go for max?
Currently the kernel doesn't honor this field. The GOP does, and with it disabled cdclk runs at max speed.
Attached the VBT dump to the bug.

With patchset 1 I found one issue: while resuming, HDA takes 4 seconds, and during that time cdclk is 79.2.
The logs below show which functions take more time.

[   78.485868] Suspending console(s) (use no_console_suspend to debug)
[   78.521654] HDMI HDA Codec ehdaudio0D2: Enter: hdmi_codec_prepare
[   78.620851] HDMI HDA Codec ehdaudio0D2: Enter1: hdac_hdmi_runtime_resume
[   78.620856] HDMI HDA Codec ehdaudio0D2: Enter2: hdac_hdmi_runtime_resume
[   78.620863] HDMI HDA Codec ehdaudio0D2: Enter3: hdac_hdmi_runtime_resume
[   78.620878] HDMI HDA Codec ehdaudio0D2: Enter4: hdac_hdmi_runtime_resume
[   78.620886] cdclk_1 79200
[   78.624431] cdclk_1 79200
[   78.626222] HDMI HDA Codec ehdaudio0D2: Enter5: hdac_hdmi_runtime_resume
[   78.626226] HDMI HDA Codec ehdaudio0D2: Enter6: hdac_hdmi_runtime_resume
[   79.629307] HDMI HDA Codec ehdaudio0D2: Enter7: hdac_hdmi_runtime_resume
[   80.632308] HDMI HDA Codec ehdaudio0D2: Enter8: hdac_hdmi_runtime_resume
[   81.635303] HDMI HDA Codec ehdaudio0D2: Exit: hdac_hdmi_runtime_resume
[   82.638302] HDMI HDA Codec ehdaudio0D2: Exit: hdmi_codec_prepare
[   82.638348] calling  input11+ @ 2310, parent: card0
[   82.638353] call input11+ returned 0 after 1 usecs

But when I hardcode the glk_calc_cdclk() minimum cdclk to 158.4, there is no 4 second delay in these functions.
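
For reference, the hack I used for that experiment was roughly the following
(sketch only, the exact upstream glk_calc_cdclk() may differ a bit):

/* intel_cdclk.c, experiment only, not a proper fix: never let GLK pick
 * the lowest 79.2 MHz bucket, clamp to 158.4 instead.
 */
static int glk_calc_cdclk(int max_pixclk)
{
	if (max_pixclk > 2 * 158400)
		return 316800;
	else
		return 158400;	/* previously this could also drop to 79200 */
}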


Ville, I tried to look at how to disable dynamic cdclk for some
platforms based on the VBT, but it gets a bit hairy. The code seems
pretty wired for going to the lowest possible cdclk. I've got the
trivial VBT parsing part, but plugging that into the cdclk code in a
clean way that could *also* be backported to stable is driving me
nuts. Any ideas? I'd like to fix the issue first, and then have you ;)
embark on the refactoring afterwards.
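
Just to illustrate what I mean by the trivial parsing part, roughly
something along these lines (hand-wavy sketch; the dynamic_cdclk_supported
bit and the vbt.dynamic_cdclk_enabled member are made-up names for
illustration, not the actual VBT layout):

/* intel_bios.c sketch, illustrative names only */
static void
parse_dynamic_cdclk(struct drm_i915_private *dev_priv,
		    const struct bdb_header *bdb)
{
	const struct bdb_general_features *general;

	general = find_section(bdb, BDB_GENERAL_FEATURES);
	if (!general)
		return;

	/* hypothetical bit: BIOS allows dynamic cdclk */
	dev_priv->vbt.dynamic_cdclk_enabled = general->dynamic_cdclk_supported;

	DRM_DEBUG_KMS("VBT dynamic cdclk: %s\n",
		      dev_priv->vbt.dynamic_cdclk_enabled ?
		      "enabled" : "disabled");
}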

It's trivial to plug the check into intel_crtc_compute_min_cdclk() and
return dev_priv->max_cdclk_freq, but a) I think that place should be
reserved for crtc_state related limitations, and it seems the completely
wrong place to do system level things, b) it's not optimal to go through
all the cdclk code only to do nothing in the end, and c) it doesn't work
for the no-crtcs-active case at init time.
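
Concretely, the check I mean is just a couple of lines near the top of
intel_crtc_compute_min_cdclk() (sketch; dynamic_cdclk_enabled being the
made-up flag from the parsing sketch above):

	/* VBT says no dynamic cdclk: always ask for the max. */
	if (!dev_priv->vbt.dynamic_cdclk_enabled)
		return dev_priv->max_cdclk_freq;

which is why I say it's trivial, but it suffers from a)-c) above.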

Just setting the .set_cdclk and .modeset_calc_cdclk hooks to NULL was
another idea, but the hooks get initialized before VBT parsing. And I
don't think that works for init either.
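
For reference, the (non-working) hook idea would look roughly like this in
intel_init_cdclk_hooks() (sketch from memory, exact hook and function names
may be off):

/* only wire up dynamic cdclk when the VBT allows it; doesn't work as-is,
 * since this runs before the VBT is parsed, and init needs a cdclk anyway
 */
if (IS_GEN9_LP(dev_priv) && dev_priv->vbt.dynamic_cdclk_enabled) {
	dev_priv->display.set_cdclk = bxt_set_cdclk;
	dev_priv->display.modeset_calc_cdclk = bxt_modeset_calc_cdclk;
}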

BR,
Jani.


