https://bugs.kde.org/show_bug.cgi?id=488281

            Bug ID: 488281
           Summary: touch screen edges broken on certain multi monitor
                    setups
    Classification: Plasma
           Product: kwin
           Version: 6.0.5
          Platform: Manjaro
                OS: Linux
            Status: REPORTED
          Severity: normal
          Priority: NOR
         Component: Gestures
          Assignee: kwin-bugs-n...@kde.org
          Reporter: luis.bue...@server23.cc
  Target Milestone: ---

SUMMARY
When configuring touch screen edge gestures in a multi-monitor setup where some
monitors don't support touch, some gestures become inaccessible.


STEPS TO REPRODUCE
1. Have a device with touch screen input (e.g. a laptop with a touch screen or
an external monitor with touch support, just *something* that can use touch
screen gestures)
2. Have a second monitor plugged in *without* touch support and position it in
the display configuration e.g. to the left of the touch screen.
3. Set up a touch edge gesture on the *right* edge -> this works
4. Set up a touch edge gesture on the *left* edge -> this doesn't work

OBSERVED RESULT
Some touch gestures don't work in mixed touch/non-touch multi-monitor setups

EXPECTED RESULT
Touch gestures work

SOFTWARE/OS VERSIONS
KDE Plasma Version: 6.0.5
KDE Frameworks Version: 6.2.0
Qt Version: 6.7.1

Also tested on postmarketOS (PinePhone) with an external screen attached; the
system info above is from my laptop with a touch screen, though.

ADDITIONAL INFORMATION
I found this while combing through the touch edge gesture code. It seems to be
caused by the ElectricBorder ScreenEdges being moved to the *outermost screen
geometry*: the left ElectricBorder is moved to the left edge of the leftmost
screen, the right ElectricBorder to the right edge of the rightmost screen, and
so on.
On top of that, the ElectricBorders for the Screen Edges gestures (invoked by
the mouse cursor) are identical to the ElectricBorders used for the Touch
Screen Edge gestures (invoked by a finger on a touch screen). This means KWin
doesn't check whether a screen *even supports touch input* when placing the
ElectricBorders - which is fine for the mouse screen edge gestures, but means a
touch edge can end up on a screen that doesn't have touch input, making it
completely inaccessible.
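
To make the geometry part concrete, here is a small self-contained sketch (not
actual KWin code; `OutputSketch` and the `hasTouch` flag are made up for
illustration) of how deriving the left edge from the combined geometry of all
outputs puts it on the non-touch monitor in the setup from the steps above:

```cpp
#include <QDebug>
#include <QList>
#include <QPoint>
#include <QRect>

struct OutputSketch {
    QRect geometry;
    bool hasTouch; // hypothetical flag; KWin would have to derive this from the touch device <-> output mapping
};

int main()
{
    // Non-touch monitor on the left, touch screen on the right (step 2 above).
    const QList<OutputSketch> outputs = {
        {QRect(0, 0, 1920, 1080), false},   // external monitor, no touch
        {QRect(1920, 0, 1920, 1080), true}, // laptop touch screen
    };

    // The edges are placed on the bounding rectangle of *all* outputs,
    // regardless of which output actually supports touch.
    QRect combined;
    for (const OutputSketch &o : outputs) {
        combined |= o.geometry;
    }

    // The "left" edge sits at x == combined.left(), i.e. on the non-touch
    // monitor, so a touch gesture bound to ElectricLeft can never be performed.
    qDebug() << "left edge at x =" << combined.left()
             << "on a touch-capable output:"
             << outputs[1].geometry.contains(QPoint(combined.left(), 0));
    return 0;
}
```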

The way I see it, the mouse screen edges and the touch screen edges probably
need to be decoupled (they can still share code the same way as now, I think,
but they must be instantiated separately: one set for the mouse gestures, one
set for the touch gestures), and KWin needs to check that the touch screen
edges actually end up on screens that support touch. My gut feeling says the
right UX would be for every touch-enabled screen to have duplicates of the same
touch edges (so I can do the same touch edge gestures on all screens) instead
of spanning the touch edges across the outermost (touch-enabled) screen
geometry.
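
A rough sketch of the UX I have in mind (again not KWin code; the types and the
`createTouchEdgeAt()` helper are hypothetical stand-ins):

```cpp
#include <QDebug>
#include <QList>
#include <QRect>

enum class EdgeSide { Left, Right, Top, Bottom };

struct OutputSketch {
    QRect geometry;
    bool hasTouch; // hypothetical flag, as in the sketch above
};

// Stand-in for whatever KWin would actually do to register a touch edge.
void createTouchEdgeAt(EdgeSide side, const QRect &area)
{
    qDebug() << "touch edge" << int(side) << "at" << area;
}

void createTouchEdges(const QList<OutputSketch> &outputs)
{
    for (const OutputSketch &o : outputs) {
        if (!o.hasTouch) {
            continue; // non-touch screens simply get no touch edges
        }
        // Each touch-capable output gets its own one-pixel-wide edge strips,
        // so the same gesture is available on every touch screen.
        const QRect g = o.geometry;
        createTouchEdgeAt(EdgeSide::Left, QRect(g.left(), g.top(), 1, g.height()));
        createTouchEdgeAt(EdgeSide::Right, QRect(g.right(), g.top(), 1, g.height()));
        createTouchEdgeAt(EdgeSide::Top, QRect(g.left(), g.top(), g.width(), 1));
        createTouchEdgeAt(EdgeSide::Bottom, QRect(g.left(), g.bottom(), g.width(), 1));
    }
}
```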

The place I found this is in `screenedge.cpp`, in
`ScreenEdges::recreateEdges()` and, following the call stack further, in
`ScreenEdges::createVerticalEdge()` and `ScreenEdges::createHorizontalEdge()`.
One screen edge currently holds both the "normal" (mouse) callback and the
touch callback, meaning touch edges and mouse edges share the same ScreenEdge
instance even though they are subject to different limitations: mouse gestures
can't have edges between two displays (otherwise the mouse couldn't move
between displays without triggering them), while touch gestures can't be
invoked on non-touch screens.
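
To illustrate that coupling (again just a sketch, not KWin's actual classes):

```cpp
#include <QRect>
#include <functional>

// A single edge object carries both the pointer-triggered action and the
// touch callback, so the placement constraints of both input methods are
// tied to one geometry.
struct SharedEdgeSketch {
    QRect geometry;                      // one geometry for both input methods
    std::function<void()> pointerAction; // pushing the cursor against the edge
    std::function<void()> touchAction;   // swiping in from the edge on a touch screen
};

// Decoupling would mean two separate sets of edge objects: pointer edges
// placed on the outermost screen geometry (so the cursor can still move
// between screens), and touch edges placed only on outputs that actually
// accept touch input.
```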

Sorry if this is all a bit rambly; I hope it makes sense.
