https://bugs.kde.org/show_bug.cgi?id=407951

--- Comment #7 from Marian Klein <mkleins...@gmail.com> ---
I noticed https://community.kde.org/KDE_Visual_Design_Group/Gestures , which is
related to this bug.

What is the difference between touchpad and touchscreen in operation?

There is no direct correspondence (size, aspect ratio) between the touchpad and
the screen/display, so the touchpad must capture delta/relative moves: if you
lift your finger from the touchpad and put it down somewhere else, the pointer
does NOT jump; it continues from where you left it.

A touchscreen, on the other hand, has to maintain a direct one-to-one
correspondence between the touch sensor and the display to stay in sync, so it
needs to capture absolute positions: a jump of the finger must translate into a
jump of the pointer.
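The two mappings above can be contrasted in a small sketch. This is purely
illustrative pseudocode, not any real KDE/libinput API; the function names and
the stroke representation (one list of (x, y) samples per finger-down..finger-up
stroke) are assumptions made for the example:

```python
def touchpad_pointer(strokes, start=(0, 0)):
    """Relative mapping: only deltas between consecutive samples within a
    stroke move the pointer, so lifting the finger and touching down
    elsewhere does not make the pointer jump."""
    x, y = start
    path = []
    for stroke in strokes:          # each stroke = one finger-down..up
        prev = None
        for (sx, sy) in stroke:
            if prev is not None:
                # accumulate only the delta, never the absolute position
                x, y = x + (sx - prev[0]), y + (sy - prev[1])
            prev = (sx, sy)
            path.append((x, y))
    return path

def touchscreen_pointer(strokes):
    """Absolute mapping: every sample IS the pointer position, so a finger
    landing somewhere else makes the pointer jump there."""
    return [(sx, sy) for stroke in strokes for (sx, sy) in stroke]
```

With two strokes `[[(0,0),(5,0)], [(100,100),(105,100)]]`, the touchpad mapping
ends at (10, 0) (the gap between strokes contributes nothing), while the
touchscreen mapping ends at (105, 100).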

Other than that, there is not much difference between a touchpad and a
touchscreen. Currently Plasma (the desktop area) implements:

1) Touchpad moves without holding the left button: a relative change of pointer
location, potentially resulting in hovering and/or a change of application
focus.

2) Touchpad moves while holding the left button: dragging/selection.

3) An application can be opened by:
  3a) a single or double click of the left touchpad/mouse button while the
pointer hovers over an icon, or
  3b) a single or double tap on the touchpad while the pointer hovers over an
icon.

4) Touchscreen one-finger moves (or possibly small consecutive jumps):
dragging/selection (as if the left mouse/touchpad button were held).

The goal is to treat both touchpad and touchscreen in the same fashion.

Can we have a virtual on-screen "sticky left touchscreen button" in Plasma to
switch between the default hovering/focus-change mode and the
selection/dragging mode?

This way we can unify touchpad and touchscreen behaviour and reuse the touchpad
settings for touchscreens, treating both as generic touch-sensor devices.
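The proposed sticky button is essentially a two-state toggle that decides how a
one-finger touchscreen motion is interpreted. A minimal sketch, assuming a
hypothetical `TouchModeToggle` class (none of these names come from Plasma):

```python
class TouchModeToggle:
    """Hypothetical state for the proposed on-screen sticky left button."""
    HOVER, DRAG = "hover", "drag"

    def __init__(self):
        self.mode = self.HOVER          # default: hovering / focus-change mode

    def toggle(self):
        """User taps the on-screen sticky button: switch modes."""
        self.mode = self.DRAG if self.mode == self.HOVER else self.HOVER

    def interpret(self, motion):
        """Translate a one-finger motion event into a pointer action."""
        if self.mode == self.HOVER:
            return ("move", motion)     # hover, possibly changing focus
        return ("drag", motion)         # as if the left button were held
```

With such a toggle, the touchscreen could reuse the same hover/drag distinction
the touchpad already gets from its physical left button.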
