Re: [RFC] kscreen and touchscreen rotation

2017-09-12 Thread Martin Flöser

Am 2017-09-11 12:56, schrieb David Edmundson:

But AFAIK actual screen rotation itself needs to be added to kwin
wayland, before we can be looking at fixing touch events.


I gave that one a try this evening. It took me about half an hour of 
thinking about where to add the code, then 10 minutes of hacking, and I 
had the QPainter compositor rotated on the drm platform :-)


This sounds and looks very promising, so I'm confident that we can have 
something pretty soonish. At least for the standard rotation by 180 
degrees (for 90 degrees I still don't understand what one is supposed 
to do).


Cheers
Martin


Re: [RFC] kscreen and touchscreen rotation

2017-09-11 Thread Martin Flöser

Am 2017-09-11 17:11, schrieb Sebastian Kügler:

On Monday, September 11, 2017 4:49:58 PM CEST Martin Flöser wrote:

So go for the simple way and change Wayland first.


Do you think the architecture / API approach is sound?


I think your API idea covers all cases. What might be important is to 
especially focus on the case of:

* internal display with touch screen
* attaching external display

This is a common use case which currently has its problems, as the 
touchscreen coordinates are not properly translated. This is something 
we need to get right no matter whether we want to rotate the screen or 
not.


Given that, it would be important to have some meta information about 
how the touch screen relates to a physical screen, e.g. its layout in 
the virtual display. So add a pointer back from the touchscreen to the 
physical screen.


For the Wayland case we hopefully don't need any of the API. KWin should 
do a sane thing without needing KScreen for it. If we have the link 
between physical screen and touch screen, KWin should do the only sane 
thing when rotating the physical screen.


Btw. on Wayland, KWin has all relevant information about the devices 
exposed through DBus, and you can use that from the KScreen side. If 
something is not yet exposed, just yell; it's easy to add.


org.kde.KWin /org/kde/KWin/InputDevice/eventXX

and then check the properties. Most important for you should be:
* bool touch
* QSize size
* QString outputName
* bool supportsCalibrationMatrix
* bool enabled

This is mostly just forwarding of libinput device configuration state. 
For more information about these: 
https://wayland.freedesktop.org/libinput/doc/latest/group__config.html#ga09a798f58cc601edd2797780096e9804
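
The DBus bits above can be poked at with the usual command-line tools. 
A quick sketch (the device node event5 is a placeholder, and the 
interface name org.kde.KWin.InputDevice is an assumption; check the 
introspection output on your own system):

```shell
# List what KWin exposes for one input device. "event5" is a
# placeholder; enumerate /org/kde/KWin/InputDevice/ to find the
# device nodes that actually exist on your system.
qdbus org.kde.KWin /org/kde/KWin/InputDevice/event5

# Read a single property, e.g. whether this device is a touch screen
# (the interface name is an assumption; verify via the listing above).
qdbus org.kde.KWin /org/kde/KWin/InputDevice/event5 org.kde.KWin.InputDevice.touch
```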


The link points directly at the calibration matrix settings, which would 
be the way to go for rotating a touch screen.


Cheers
Martin


Re: [RFC] kscreen and touchscreen rotation

2017-09-11 Thread Sebastian Kügler
On Monday, September 11, 2017 4:49:58 PM CEST Martin Flöser wrote:
> So go for the simple way and change Wayland first.

Do you think the architecture / API approach is sound?
-- 
sebas

http://www.kde.org | http://vizZzion.org


Re: [RFC] kscreen and touchscreen rotation

2017-09-11 Thread Martin Flöser

Am 2017-09-11 12:56, schrieb David Edmundson:

On Mon, Sep 11, 2017 at 11:58 AM, Sebastian Kügler wrote:


Hi all,

One of the things we talked about during Akademy was better support
for
convertible hardware. I've played around a bit with my Thinkpad X1
Yoga
and screen rotation. I can of course rotate the screen "manually"
through XRandR (Wayland is another story on that altogether), but
that
will screw up the touchscreen. Doing some research, I found out
that:

- touchscreen and display are completely separate things, the system
doesn't know they're sitting on top of each other
- They need to be rotated separately
- rotation in X goes through XInput2, on Wayland, I don't know


on wayland

touch is libinput->kwin->app  (protocol is wl_touch)

But AFAIK actual screen rotation itself needs to be added to kwin
wayland, before we can be looking at fixing touch events.


These are independent of each other: we can make touch rotation work 
without needing visual rotation, and vice versa. Rotating the visual 
output should in fact be quite simple.




I'm not 100% sure the input needs transforming when rotated. If you
look at QtWaylandWindow when the screen rotates, the surface rotates
too, so if co-ordinates are surface local we shouldn't be changing it?


Yep, it needs to be transformed in KWin, otherwise things like touch on 
the window decoration would not work ;-)


Cheers
Martin


Re: [RFC] kscreen and touchscreen rotation

2017-09-11 Thread Martin Flöser

Am 2017-09-11 11:58, schrieb Sebastian Kügler:

Hi all,

One of the things we talked about during Akademy was better support for
convertible hardware. I've played around a bit with my Thinkpad X1 Yoga
and screen rotation. I can of course rotate the screen "manually"
through XRandR (Wayland is another story on that altogether), but that
will screw up the touchscreen. Doing some research, I found out that:

- touchscreen and display are completely separate things, the system
  doesn't know they're sitting on top of each other


It does know, it's just not properly exposed. Libinput does have the 
output information, it's just not set through udev in general, so in 
practice we don't have it. But it's in general a fixable problem, which 
might need some work together with upstream and downstream. We need to 
figure out whether that is never set, or just missing in all distros 
except Fedora (which is what I would expect).


Anyway for your design planning you should be able to assume that there 
is a link between output and touchscreen.
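
For reference, the udev side of that link is a device property that 
libinput exposes through libinput_device_get_output_name(). A 
hypothetical rule could look like this (the property name WL_OUTPUT and 
all match values are assumptions; verify against the libinput 
documentation before relying on them):

```
# Hypothetical: tag a touchscreen with the output it sits on.
# Property name and match values are assumptions, for illustration only.
ACTION=="add|change", SUBSYSTEM=="input", ATTRS{name}=="Hypothetical Touch Panel", ENV{WL_OUTPUT}="eDP-1"
```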



- They need to be rotated separately
- rotation in X goes through XInput2, on Wayland, I don't know


KWin through libinput.


It's just an idea for now, but I'd like to get some feedback about it
at this point.


My feedback would be: ignore X11, concentrate on Wayland. Once Wayland 
works, add it to X or ignore it.


Why:
1) Wayland is easier, we control the stack
2) We agreed that anything new should be Wayland first
3) Going X11 first or considering X11 at all might make the work on 
Wayland more difficult


So go for the simple way and change Wayland first.

Cheers
Martin


Re: [RFC] kscreen and touchscreen rotation

2017-09-11 Thread Sebastian Kügler
On Monday, September 11, 2017 12:56:09 PM CEST David Edmundson wrote:
> On Mon, Sep 11, 2017 at 11:58 AM, Sebastian Kügler wrote:
> > Hi all,
> > 
> > One of the things we talked about during Akademy was better support
> > for convertible hardware. I've played around a bit with my Thinkpad
> > X1 Yoga and screen rotation. I can of course rotate the screen
> > "manually" through XRandR (Wayland is another story on that
> > altogether), but that will screw up the touchscreen. Doing some
> > research, I found out that:
> > 
> > - touchscreen and display are completely separate things, the system
> > 
> >   doesn't know they're sitting on top of each other
> > 
> > - They need to be rotated separately
> > - rotation in X goes through XInput2, on Wayland, I don't know
> 
> on wayland
> touch is libinput->kwin->app  (protocol is wl_touch)
> 
> But AFAIK actual screen rotation itself needs to be added to kwin
> wayland, before we can be looking at fixing touch events.
> 
> I'm not 100% sure the input needs transforming when rotated. If you
> look at QtWaylandWindow when the screen rotates, the surface rotates
> too, so if co-ordinates are surface local we shouldn't be changing it?
> It's quite confusing.
> 
> > I'm working on some code that does the rotation from KScreen right
> > now. My idea is to add this to KScreen's API and allow the KScreen
> > user to also rotate the display.
> > 
> > On X, this happens async, and this bears the risk that input ends
> > up on different coordinates during switching (display has already
> > rotated, touchscreen still in process, or some race condition like
> > that -- I don't think we can really prevent that), on Wayland, we
> > should be able to prevent that from happening as we hopefully can
> > make input rotation atomic along with the rotation itself, the
> > protocol and API of screen management in Wayland allow that.
> > 
> > We probably will need to guess which display has a touchscreen
> > attached, but for some cases, I think we can make some pretty
> > reasonable guesses
> > - one display, one touchscreen seems like the most common case,
> > and is clear
> > - one touchscreen and multiple displays: touchscreen may be on the
> > internal display
> > The mapping heuristic is kept internally, and we can work on that
> > later, once we got the basics in place.
> > 
> > Architecture / API design: My idea right now is to add a new
> > relatively simple type class to the KScreen API, TouchScreen, that
> > can be received from an Output, something like
> > 
> > output->setRotation(Output::Left);
> > TouchscreenList *touchscreens = output->touchscreens();
> > for (auto touchscreen : *touchscreens) {
> > touchscreen->setRotation(Output::Left);
> > }
> > // ... then setting the new config as usual through
> > SetConfigOperation()
> 
> I don't understand what this helps with?

Changing both input and display at the same time. This will then go
through either the xrandr or the kwayland backend and use xinput or
wl_touch (as you note) to set the input rotation.

Also, the mapping will be in kscreen then.

> That will store a value, but you haven't said who's going to do
> anything with it, which is the important part.
> 
> If we're going to set it to be the same as the screen, we may as well
> just have the code that handles input read the screen rotation.

On X, that's the X server; we're expected to set it through xinput.

On Wayland, that's different, that's why I'm thinking it would have to
be backend-specific, which brings in kscreen which has the backends
already.
-- 
sebas

http://www.kde.org | http://vizZzion.org


Re: [RFC] kscreen and touchscreen rotation

2017-09-11 Thread David Edmundson
On Mon, Sep 11, 2017 at 11:58 AM, Sebastian Kügler wrote:

> Hi all,
>
> One of the things we talked about during Akademy was better support for
> convertible hardware. I've played around a bit with my Thinkpad X1 Yoga
> and screen rotation. I can of course rotate the screen "manually"
> through XRandR (Wayland is another story on that altogether), but that
> will screw up the touchscreen. Doing some research, I found out that:
>
> - touchscreen and display are completely separate things, the system
>   doesn't know they're sitting on top of each other
> - They need to be rotated separately
> - rotation in X goes through XInput2, on Wayland, I don't know
>

on wayland
touch is libinput->kwin->app  (protocol is wl_touch)

But AFAIK actual screen rotation itself needs to be added to kwin wayland,
before we can be looking at fixing touch events.

I'm not 100% sure the input needs transforming when rotated. If you look at
QtWaylandWindow when the screen rotates, the surface rotates too, so if
co-ordinates are surface local we shouldn't be changing it?
It's quite confusing.



> I'm working on some code that does the rotation from KScreen right now.
> My idea is to add this to KScreen's API and allow the KScreen user to
> also rotate the display.
>
> On X, this happens async, and this bears the risk that input ends up on
> different coordinates during switching (display has already rotated,
> touchscreen still in process, or some race condition like that -- I
> don't think we can really prevent that), on Wayland, we should be able
> to prevent that from happening as we hopefully can make input rotation
> atomic along with the rotation itself, the protocol and API of screen
> management in Wayland allow that.
>
> We probably will need to guess which display has a touchscreen
> attached, but for some cases, I think we can make some pretty
> reasonable guesses
> - one display, one touchscreen seems like the most common case, and
> is clear
> - one touchscreen and multiple displays: touchscreen may be on the
> internal display
> The mapping heuristic is kept internally, and we can work on that
> later, once we got the basics in place.
>
> Architecture / API design: My idea right now is to add a new relatively
> simple type class to the KScreen API, TouchScreen, that can be received
> from an Output, something like
>
> output->setRotation(Output::Left);
> TouchscreenList *touchscreens = output->touchscreens();
> for (auto touchscreen : *touchscreens) {
> touchscreen->setRotation(Output::Left);
> }
> // ... then setting the new config as usual through SetConfigOperation()
>

I don't understand what this helps with?

That will store a value, but you haven't said who's going to do anything
with it, which is the important part.

If we're going to set it to be the same as the screen, we may as well just
have the code that handles input read the screen rotation.