Re: Wayland Relative Pointer API Progress

2015-07-20 Thread Bill Spitzak
On Sun, Jul 19, 2015 at 6:06 AM, x414e54 x414...@linux.com wrote:

This seems to be getting WAY off course.

My request is for set_cursor_position_hint to actually move the cursor,
rather than forcing the client to make a fake cursor. It could very well
move the cursor without syncing with the drawing.

I screwed up when I suggested that my proposal could also be synced with
drawing (by making the set_cursor_position be latched to wl_surface
commit). For some reason this then turned into a long argument, continued
here apparently, about whether this is a good idea or not. Maybe it is not.
But that is irrelevant because the alternative of a fake cursor FORCES you
to use sync update!

Even with subsurfaces GUI applications really should not push their
 own cursors around as they cannot control the latency between the user
 moving the mouse and receiving the event. Think about x11 forwarding
 where the window is actually composited locally but running on a
 remote server. You would lose the benefit over using a frame-based
 protocol such as VNC. It is not abnormal to have 50-100ms latency to a
 remote server. Sure you are always going to have a lag once you click
 a button and wait for the remote application to respond but at least
 the user knows where the mouse was when they clicked.


Yes, obviously. This is why I am interested in a pointer-lock that only
works while the mouse button is held down. I am guessing that asynchronous
highlight and tooltip updates are not annoying enough to be worth the
delayed and intermittent cursor motion that would result from requiring the
client to always position the cursor.

When dragging over a high-latency connection, clients already have to deal
with trying to
guess what graphic the user was looking at when they released the button.
This is not the xy of the button-release event, and might not even be the
last drawing that was sent. It is really hard. However my proposal at least
stops the cursor from getting ahead and further confusing the user about
where the release happens.


 Also what Bill was talking about was syncing the exact position of the
 mouse to the rendered graphics (running a subsurface in synced mode)
 which not only means the mouse will be skipping frames you will have
 issues because SwapBuffers is asynchronous so you could end up
 overwriting the state cache for the cursor position. You would have to
 stick a fence in the gl command queue and wait before you called
 set_position. In this case your graphics pipeline would be faster if
 you just did a blit or texture draw because there is no CPU/GPU sync
 required.


It sounds like you are saying that there is hardware where it is
impossible, or at least difficult or slow, to make the cursor move in sync
with the swap that updates the composited desktop. If this is true, it means
the compositor cannot use the cursor hardware for a subsurface, so in fact
my proposal (without sync) will provide a more efficient way of moving the
cursor around in pointer-lock mode.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Wayland Relative Pointer API Progress

2015-07-19 Thread x414e54
On Wed, Jul 15, 2015 at 9:09 PM, Daniel Stone dan...@fooishbar.org wrote:
 I haven't read the vast majority of this thread, but this isn't really
 true. There's nothing special about the cursor in drawing: just hide
 the cursor, create a small subsurface with SHM buffers, and Weston
 will use the hardware cursor as it would for an actual cursor surface.
 Problem solved.

 Cheers,
 Daniel

Even with subsurfaces GUI applications really should not push their
own cursors around as they cannot control the latency between the user
moving the mouse and receiving the event. Think about x11 forwarding
where the window is actually composited locally but running on a
remote server. You would lose the benefit over using a frame-based
protocol such as VNC. It is not abnormal to have 50-100ms latency to a
remote server. Sure you are always going to have a lag once you click
a button and wait for the remote application to respond but at least
the user knows where the mouse was when they clicked.

Also what Bill was talking about was syncing the exact position of the
mouse to the rendered graphics (running a subsurface in synced mode)
which not only means the mouse will be skipping frames you will have
issues because SwapBuffers is asynchronous so you could end up
overwriting the state cache for the cursor position. You would have to
stick a fence in the gl command queue and wait before you called
set_position. In this case your graphics pipeline would be faster if
you just did a blit or texture draw because there is no CPU/GPU sync
required.

This is kind of giving me a headache now, so I will make this my last comment here.


Re: Wayland Relative Pointer API Progress

2015-07-16 Thread Bill Spitzak
On Wed, Jul 15, 2015 at 5:09 AM, Daniel Stone dan...@fooishbar.org wrote:

 On 29 June 2015 at 20:22, Bill Spitzak spit...@gmail.com wrote:
  On Sun, Jun 28, 2015 at 7:32 AM, x414e54 x414...@linux.com wrote:
  Clients do not draw the mouse cursor, either the GPU (using hardware
  overlays) or the WM does.
 
  Yes, I want to allow clients to use this CPU/WM support. Currently the
  client *has* to draw the cursor (and hide the hardware one) to sync its
  position with the drawing. If instead the client just *moves* the cursor,
  then the cursor is actually drawn by the compositor like you say.

 I haven't read the vast majority of this thread, but this isn't really
 true. There's nothing special about the cursor in drawing: just hide
 the cursor, create a small subsurface with SHM buffers, and Weston
 will use the hardware cursor as it would for an actual cursor surface.
 Problem solved.


The problem I have with that is that it is impossible to prevent an
incorrect composite where there is either no cursor or two cursors rendered.

The wayland api could be changed so that setting a cursor image is buffered
until a matching wl_surface commit, and this could fix the entry into
pointer-lock mode. However when locked mode is lost I don't see any way to
do it, as it may be a different client setting the cursor image than the
one that would be removing the subsurface with the fake cursor in it.

This is probably not a big deal except (as far as I can tell) it can be
fixed with a trivial change to the proposed pointer-lock. The current
proposal makes the cursor stop moving (which means the client has to hide
the cursor in any conceivable usage) and adds a set_cursor_hint request
that is used to set the xy position the cursor will be at when pointer lock
is lost. I want set_cursor_hint to actually move the cursor (and possibly
rename it to remove the word "hint"). If the client does not want that, it
can hide the cursor, which it has to do anyway in the current proposal.

For some reason this has devolved into a big argument about whether locking
the cursor position to the rendered graphics is good or bad. But since the
alternative of using a subsurface would also do that, whether it is good or
bad does not matter for deciding this.

This change combined with a pointer lock that lasts the exact same time
that the implicit grab for a button down lasts would be very useful to make
a slow scrollbar or a 3-D trackball where the cursor sticks to the
rendered ball's surface. Reusing an api designed for games for this seems
like a good idea to reduce Wayland's api size.


Re: Wayland Relative Pointer API Progress

2015-07-15 Thread Daniel Stone
On 29 June 2015 at 20:22, Bill Spitzak spit...@gmail.com wrote:
 On Sun, Jun 28, 2015 at 7:32 AM, x414e54 x414...@linux.com wrote:
 Clients do not draw the mouse cursor, either the GPU (using hardware
 overlays) or the WM does.

 Yes, I want to allow clients to use this CPU/WM support. Currently the
 client *has* to draw the cursor (and hide the hardware one) to sync its
 position with the drawing. If instead the client just *moves* the cursor,
 then the cursor is actually drawn by the compositor like you say.

I haven't read the vast majority of this thread, but this isn't really
true. There's nothing special about the cursor in drawing: just hide
the cursor, create a small subsurface with SHM buffers, and Weston
will use the hardware cursor as it would for an actual cursor surface.
Problem solved.

Cheers,
Daniel


Re: Wayland Relative Pointer API Progress

2015-07-14 Thread Bill Spitzak
On Mon, Jul 13, 2015 at 9:48 PM, x414e54 x414...@linux.com wrote:


 Okay, let's say for whatever reason you allow the client direct access
 to set the cursor position in a way that did not introduce any extra
 lag.


I'm going to ignore all this, because it is obvious that what I propose has
lag. Avoiding lag is impossible. It is disingenuous of you to attach an
impossible requirement to my proposal and then say the proposal is stupid
because the requirement you added is impossible!

The ONLY difference between my proposal and pointer lock is that the
cursor image moves to where the cursor_hint as you call it is. Your
proposal requires the client to hide the cursor. Mine will be IDENTICAL if
the client hides the cursor, but leaves open the possibility of NOT hiding
the cursor. THAT'S IT!

You STILL cannot sync a scrollbar with the mouse position because the
 event loop thread should not be tied to the render loop thread and the
 call to swap buffers would be out of sync with the call to set
 position.


The cursor is moved by the render loop, not the event loop.

Either way your event loop is now completely tied into your render
 loop, meaning your render loop will also introduce extra latency to
 your event loop. Let's be extreme and say the render loop updates at
 1fps on my computer (because it is really slow, something out of your
 application's control).


If the render takes one second, then the reaction to an event will be
delayed by one second. Pushing a button or moving a slider will not show
the new slider or button until one second later. It does not matter whether
pointer-lock is used, and it does not matter whether you have to hide the
cursor when using pointer-lock or you can move it (as I propose). All of
this is entirely irrelevant.

The mouse cursor now moves at 1fps and every time I click nothing
 happens for a whole second.


Yes that is EXACTLY how it should work. Nothing happens for one second. And
with my proposal the mouse cursor does not slide around atop the thing that
is not moving, instead it looks a lot more like you really grabbed it. This
is exactly what I want.

If you don't think so, just don't use the pointer lock. That is not too
hard.

How is this smooth?


It's not smooth. Obviously. But it has the optical illusion of smoothness
because the cursor does not slide around atop the jumping image, instead
they jump together. I recommend you try this with a fake program if you
don't believe me.

I really could not trust that any application would be bothered to
 implement this correctly.


I don't trust any application to implement *anything* correctly. That is
not a good argument.


 An event loop in most applications is separate from the render loop so
 you cannot use ping/pong events.


The render loop would be sending the cursor-positioning request.

You also cannot use an update to cursor position because the cursor
 genuinely may not be moving.


You can send the cursor position request with the same coordinates as the
last one you sent.


 Also you cannot use a buffer update because the applications should
 not be required to commit a buffer that has not changed just to
 move/not move the cursor.


This is true of everything that is synced with a wl_surface commit request.
I should check, but I believe the requirement to change the buffer to get
other effects has been fixed. If not, it should be.
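The commit-latched model referred to here can be illustrated with a toy sketch (hypothetical names; `set_cursor_position` is not an existing wl_surface request): double-buffered requests accumulate in pending state, and commit applies them whether or not a new buffer was attached.

```python
# Toy model of commit-latched surface state. All names are illustrative;
# set_cursor_position() is the hypothetical request from this thread.
class Surface:
    def __init__(self):
        self.pending = {}   # state set by requests, not yet applied
        self.current = {}   # state the compositor actually uses

    def set_cursor_position(self, x, y):
        # Hypothetical latched request: takes effect only on commit.
        self.pending["cursor"] = (x, y)

    def attach(self, buffer):
        self.pending["buffer"] = buffer

    def commit(self):
        # Apply all pending state atomically; no new buffer is required.
        self.current.update(self.pending)
        self.pending = {}

s = Surface()
s.attach("buf-1")
s.commit()
s.set_cursor_position(40, 12)
s.commit()                                 # commit without a buffer change
assert s.current["cursor"] == (40, 12)     # cursor state applied
assert s.current["buffer"] == "buf-1"      # old buffer retained
```

This shows why the "must commit a new buffer" objection only matters if commit refuses to latch state without an attach.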

Furthermore you are putting too much trust into perfectly performing
 clients, which are much harder to implement in practice than in theory.


I have no idea what you mean by this. If the client has succeeded in
setting up pointer lock I think we can assume it is performing.


 I am really not sure you understand how computers work. Are you somehow
 under the impression that latency does not exist?


Yes of course latency exists. The existence of latency is one of the
reasons for my proposal (I also want to allow non-linear movement of the
cursor). You seem to be saying any idea that does not remove latency is
wrong. But I'm afraid that is 100% of Wayland. The only way to remove all
latency is to do nothing.

On some older systems the client probably is involved in moving a window,
 for example if they do not have double buffering and draw directly to the
 on-screen framebuffer.


Yes, early Windows and pre-compositing X did this. It has nothing to do with
my proposal, and Wayland does not work this way.

I was discussing window movement because I am arguing that it looks smooth
because Windows (and current implementations of Wayland) move the hardware
cursor in sync with the composited image; the position of the cursor and the
position of the dragged window are updated from the same motion event (which
could have been a very long time ago), and thus they are locked together.

This is not true of X11 as the window is moved by the window manager
process and is not in sync with the cursor being moved.

This is not visible on modern machines as they 

Re: Wayland Relative Pointer API Progress

2015-07-14 Thread Bill Spitzak
On Tue, Jul 14, 2015 at 4:14 PM, x414e54 x414...@linux.com wrote:

 You are either just trolling or have no idea what you are talking
 about and have never tried to implement anything you are saying.


 I stopped listening at "The render loop would be sending the
 cursor-positioning request." Not sure how you cannot see the problem
 with this.


   static atomic cursor_xy;

   event_loop {
       read_event();
       cursor_xy = cursor_xy_from_events();
   }

   render_loop {
       set_cursor_position(cursor_xy);
   }

Sorry I don't see any problem. Are you concerned that the number of
set_cursor_position calls may differ from the number of mouse movements?
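The quoted pattern can be fleshed out as a minimal runnable sketch (Python for brevity; `set_cursor_position` is the hypothetical request under discussion, and a lock stands in for the atomic). It makes the counting question concrete: the render loop can issue fewer positioning requests than there were motion events.

```python
import threading

# Sketch of the quoted event-loop/render-loop pattern. The event loop
# publishes the latest cursor position; the render loop samples it once
# per frame. Names are illustrative, not an existing Wayland API.
class CursorState:
    def __init__(self):
        self._lock = threading.Lock()
        self._xy = (0, 0)

    def on_motion_event(self, x, y):
        # Event loop: overwrite with the most recent position.
        with self._lock:
            self._xy = (x, y)

    def sample_for_frame(self):
        # Render loop: read the latest position once per frame.
        with self._lock:
            return self._xy

calls = []

def set_cursor_position(xy):
    # Hypothetical protocol request; recorded here for inspection.
    calls.append(xy)

state = CursorState()
# Three motion events arrive between two rendered frames...
for xy in [(10, 10), (20, 25), (30, 40)]:
    state.on_motion_event(*xy)
# ...but the render loop issues only one request, carrying the latest value.
set_cursor_position(state.sample_for_frame())
print(calls)
```

Intermediate positions are coalesced by design, which is exactly the mismatch between event count and request count being discussed.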

I will be blocking your e-mail address and disregarding any input you
 have in any future discussion. It is just a waste of time to continue
 this discussion.


Whatever.


 Sorry, I am actually genuinely trying to improve the protocol for some
 of the ways games currently work based on my Windows to Linux porting
 experience. Also trying to make it easier to support 3d input devices
 and 3d compositors using something like the HTC Vive. My suggestions
 may not be the right ones but they are seriously better than some of the
 junk you have suggested.


I appreciate that and think the pointer lock is really close, and
experience with games is very useful to make an api that serves multiple
purposes. All I am trying to do is make a trivial change to your proposal
so that the pointer-lock can be used to make a slow cursor or a
trackball-locked cursor as well as supporting games. My change to your
proposal is exactly this:

- The cursor hint actually moves the cursor!

Since your proposal requires the client to hide the cursor, and there is no
visible difference between what a user sees when an invisible cursor is
moved, my change makes no difference to any program that is currently using
the pointer lock.


Re: Wayland Relative Pointer API Progress

2015-07-14 Thread x414e54
You are either just trolling or have no idea what you are talking
about and have never tried to implement anything you are saying.

I stopped listening at "The render loop would be sending the
cursor-positioning request." Not sure how you cannot see the problem
with this.

I will be blocking your e-mail address and disregarding any input you
have in any future discussion. It is just a waste of time to continue
this discussion.

Sorry, I am actually genuinely trying to improve the protocol for some
of the ways games currently work based on my Windows to Linux porting
experience. Also trying to make it easier to support 3d input devices
and 3d compositors using something like the HTC Vive. My suggestions
may not be the right ones but they are seriously better than some of the
junk you have suggested.


Re: Wayland Relative Pointer API Progress

2015-07-13 Thread x414e54
On Tue, Jun 30, 2015 at 4:22 AM, Bill Spitzak spit...@gmail.com wrote:


 On Sun, Jun 28, 2015 at 7:32 AM, x414e54 x414...@linux.com wrote:


 Clients do not draw the mouse cursor, either the GPU (using hardware
 overlays) or the WM does.


 Yes, I want to allow clients to use this CPU/WM support. Currently the
 client *has* to draw the cursor (and hide the hardware one) to sync its
 position with the drawing. If instead the client just *moves* the cursor,
 then the cursor is actually drawn by the compositor like you say.


Okay, let's say for whatever reason you allow the client direct access
to set the cursor position in a way that did not introduce any extra
lag.

Let's say the client gets direct access to the GPU to set a hardware
overlay position (I think hardware cursors are currently disabled due
to some Intel vsync issue, but this does not affect the point).

Completely ignoring any security issues with direct access and any
translation issues from transformed or scaled windows or mice which do
not move in two dimensions.

You STILL cannot sync a scrollbar with the mouse position because the
event loop thread should not be tied to the render loop thread and the
call to swap buffers would be out of sync with the call to set
position.

Even if you decided on putting it all onto one thread, or waited, you would
still have issues where vsync refreshed the cursor position before the
window, or the process was preempted, as the calls are not atomic.

As there is no specific egl api for cursors anyway you would have to
drop hardware cursors and draw your own software cursor on top of your
surface.

Either way your event loop is now completely tied into your render
loop, meaning your render loop will also introduce extra latency to
your event loop. Let's be extreme and say the render loop updates at
1fps on my computer (because it is really slow, something out of your
application's control).
The mouse cursor now moves at 1fps and every time I click nothing
happens for a whole second.

How is this smooth?

I really could not trust that any application would be bothered to
implement this correctly.


 It is a bit of an extreme case example but:

 1) User moves mouse and WM adds to clients queue then blocks mouse
 movement.
 2) The application's render thread hangs but event thread continues.
 3) Mouse does not move.


 Lets add the next steps:

  4) Compositor does not get a cursor position or pong request in the next
 1/10 second or so.
  5) Compositor draws the cursor in the new position just like it does now

 I am just trying to allow clients to take advantage of sync if they are
 capable of it. Falling back to the previous behaviour is acceptable. In fact
 this is better than clients hiding the cursor and drawing their own in that
 such fallbacks work. A frozen client drawing its own cursor means the
 cursor will appear only when the user moves it out of the window, and at a
 pretty unpredictable place, and the old cursor image will still be there on
 screen.

An event loop in most applications is separate from the render loop so
you cannot use ping/pong events. The main point of the ping/pong is to
pop up the force quit dialog if the application stops processing input
(because the user would be unable to click close), not pauses in
rendering output.

You also cannot use an update to cursor position because the cursor
genuinely may not be moving. The compositor may be able to see the
relative motion event but it does not know if the client wants to move
the cursor or not.

Also you cannot use a buffer update because the applications should
not be required to commit a buffer that has not changed just to
move/not move the cursor.

Furthermore you are putting too much trust into perfectly performing
clients, which are much harder to implement in practice than in theory.


 There should be no difference between when moving a window and not
 because the client is not involved directly.


 Whether a client is involved directly should not be visible to a user.

I am really not sure you understand how computers work. Are you somehow
under the impression that latency does not exist?

On some older systems the client probably is involved in moving a window,
for example if they do not have double buffering and draw directly to the
on-screen framebuffer.

The choice here is either to wait for the client and have laggy window
moving, or to just draw whatever junk was in the framebuffer. I guess an
example of this on older OSes was when you moved a window and ended
up with the decoration or the window above rendered partially inside a
hung application.

That would be even worse in your scenario because if the application
below lagged then the window on top would not be allowed to move until
the application below timed out.

Either way it is pretty obvious to the user that the client is
involved directly.

 I'm sure this works this way on Windows due to simplistic implementation of
 the compositor, and that is why resizing 

Re: Wayland Relative Pointer API Progress

2015-06-29 Thread Pekka Paalanen
On Tue, 23 Jun 2015 11:57:23 -0700
Bill Spitzak spit...@gmail.com wrote:

 On Sun, Jun 21, 2015 at 11:46 PM, x414e54 x414...@linux.com wrote:
 

  So in your system the compositor would have to block mouse cursor
  movement until an application had dispatched and processed its events.
  Literally in this instance the user would be moving the physical mouse
  and nothing happening on screen
 
 
 Yes, despite your attempt to be condescending, you are accurately
 describing EXACTLY what I want!
 
 "nothing happening on the screen" is BETTER than "the wrong thing is drawn
 on the screen".

No, that is not a universal fact. It depends.


Thanks,
pq


Re: Wayland Relative Pointer API Progress

2015-06-29 Thread Bill Spitzak
On Sun, Jun 28, 2015 at 7:32 AM, x414e54 x414...@linux.com wrote:


 Clients do not draw the mouse cursor, either the GPU (using hardware
 overlays) or the WM does.


Yes, I want to allow clients to use this CPU/WM support. Currently the
client *has* to draw the cursor (and hide the hardware one) to sync its
position with the drawing. If instead the client just *moves* the cursor,
then the cursor is actually drawn by the compositor like you say.


 It is a bit of an extreme case example but:

 1) User moves mouse and WM adds to clients queue then blocks mouse
 movement.
 2) The application's render thread hangs but event thread continues.
 3) Mouse does not move.


Lets add the next steps:

 4) Compositor does not get a cursor position or pong request in the next
1/10 second or so.
 5) Compositor draws the cursor in the new position just like it does now

I am just trying to allow clients to take advantage of sync if they are
capable of it. Falling back to the previous behaviour is acceptable. In
fact this is better than clients hiding the cursor and drawing their own in
that such fallbacks work. A frozen client drawing its own cursor means the
cursor will appear only when the user moves it out of the window, and at a
pretty unpredictable place, and the old cursor image will still be there on
screen.
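The fallback in steps 4-5 can be sketched as a small per-frame policy function (a sketch only; the 1/10-second interval and all names are assumptions from this thread, not an existing compositor API):

```python
# Toy compositor policy: wait briefly for the client's cursor-position
# request; if none arrives within the timeout, fall back to moving the
# cursor exactly as compositors do today. Names/interval are illustrative.
TIMEOUT = 0.1   # seconds, the "1/10 second or so" from the thread

def composite_cursor(pending_motion, client_reply, waited):
    """Return the cursor position to draw this frame, or None to hold."""
    if client_reply is not None:
        return client_reply      # client synced the cursor itself (step 5 avoided)
    if waited >= TIMEOUT:
        return pending_motion    # step 5: compositor moves the cursor anyway
    return None                  # step 4: still within the grace period

# Client responded in time: its position wins.
assert composite_cursor((100, 50), (98, 50), waited=0.02) == (98, 50)
# Client silent but still within the grace period: cursor holds.
assert composite_cursor((100, 50), None, waited=0.02) is None
# Client timed out: compositor falls back to the raw motion position.
assert composite_cursor((100, 50), None, waited=0.15) == (100, 50)
```

The key property is that a frozen client degrades to today's behaviour instead of leaving a stale client-drawn cursor on screen.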

There should be no difference between when moving a window and not
 because the client is not involved directly.


Whether a client is involved directly should not be visible to a user.
Yes, I'm sure this works this way on Windows due to a simplistic
implementation of the compositor, and that is why resizing does not work
this way (as you noticed). However that does not negate the fact that this
makes the window positioning look better, even on underpowered computers.
What I am proposing is that clients be able to take advantage of this to
make other interactions besides window dragging be perceptually smoother.

For native GUIs Windows uses a completely different system of drawing
 based on WM_PAINT events. I believe even scrollbars are drawn and
 moved server side (by the WM), clients do not render scrollbars
 themselves. This would be similar to the (over the top) subsurface
 style approach I suggested earlier.


No that is not how it works. All rendering is done on the client side in
modern Windows. It is done by system libraries but they are running in the
client process and drawing into a memory-mapped image just like Wayland
does.


Re: Wayland Relative Pointer API Progress

2015-06-28 Thread x414e54
*facepalm*

On Wed, Jun 24, 2015 at 3:57 AM, Bill Spitzak spit...@gmail.com wrote:
 I am not introducing any lag. If the client takes 1/2 second to draw the
 result of a mouse movement, it will still take 1/2 second under my scheme.
 Exactly the same amount of lag. However in my scheme the cursor will be
 drawn in the correct place atop the dragged object, thus satisfying the
 every frame is perfect requirement while not changing anything about
 lag.

Clients do not draw the mouse cursor, either the GPU (using hardware
overlays) or the WM does.



 So in your system the compositor would have to block mouse cursor
 movement until an application had dispatched and processed its events.
 Literally in this instance the user would be moving the physical mouse
 and nothing happening on screen


 Yes, despite your attempt to be condescending, you are accurately describing
 EXACTLY what I want!

I was not being condescending... I am just not sure how you cannot see
why this would not work.

It is a bit of an extreme case example but:

1) User moves mouse and WM adds to clients queue then blocks mouse movement.
2) The application's render thread hangs but event thread continues.
3) Mouse does not move.


There are plenty of other issues, such as when the application has to load
or download something before it can render the next frame, blocking the
mouse and causing jerkiness and lag.

Even hiding the mouse can be irritating for users, but not as
irritating as the mouse not moving for 1-2 s and then suddenly appearing
on the other side of the screen.


 It does not work which is why no Window Managers ever do or should do
 this.


 Really? Simple experiments show that Windows does this when dragging a
 window (drag a window really fast and spot where it drew the cursor; it
 draws it many fewer times than if you move the mouse at the same speed
 without dragging, and perfectly locked to the window). This is almost 100%
 of the reason people think Windows drags windows more smoothly than X11.



Maybe your computer is just underpowered so the WM drops fps?

There should be no difference between when moving a window and not
because the client is not involved directly.

I state it again there are no window managers that block mouse
movement waiting for a client.

Take the example of window resizing as that is much easier to notice.
OS X, Windows, X11 and Wayland (GNOME Shell) do not sync the mouse with
the new window size.


For native GUIs Windows uses a completely different system of drawing
based on WM_PAINT events. I believe even scrollbars are drawn and
moved server side (by the WM), clients do not render scrollbars
themselves. This would be similar to the (over the top) subsurface
style approach I suggested earlier.


Re: Wayland Relative Pointer API Progress

2015-06-23 Thread Bill Spitzak
On Sun, Jun 21, 2015 at 11:46 PM, x414e54 x414...@linux.com wrote:

 Hi, I have been away for a while and quite busy, so I did not get a
 chance to respond.

 On Tue, Apr 28, 2015 at 3:46 AM, Bill Spitzak spit...@gmail.com wrote:
  No, I absolutely 100% disagree.
 
  Synchronized updating so things don't vibrate or shift is more important
  than FPS. It is even stated as part of Wayland's basic design criteria:
  every frame is perfect.

 Wow... no seriously...

 This is wrong on so many levels and probably shows you either have
 never used either Weston or Gnome Shell when they had laggy slow mouse
 issues.


You do not seem to be understanding in the least what I am asking for.

There is one key word in there: "lag".


I am not introducing any lag. If the client takes 1/2 second to draw the
result of a mouse movement, it will still take 1/2 second under my scheme.
Exactly the same amount of lag. However in my scheme the cursor will be
drawn in the correct place atop the dragged object, thus satisfying the
every frame is perfect requirement while not changing anything about
lag.


 So in your system the compositor would have to block mouse cursor
 movement until an application had dispatched and processed its events.
 Literally in this instance the user would be moving the physical mouse
 and nothing happening on screen


Yes, despite your attempt to be condescending, you are accurately
describing EXACTLY what I want!

"nothing happening on the screen" is BETTER than "the wrong thing is drawn
on the screen".

Just like all other interaction such as window raising the compositor can
certainly time out and fake it if it appears the client is dead. I also
thought it might be ok to limit sync mouse cursor to times when the button
is down, and make it optional for the client, by some slight modifications
to the proposed pointer lock api. But you may be correct (even though you
are pretending you are wrong) that it is desirable even when moving the
mouse, as it would allow the program to highlight fine-detail hit targets.
This is useful in complex geometry to make it easier to pick things that
are close together. Current programs either require the user to hold the
mouse still for a split second to make sure the correct thing will be
picked on click, or complex things like Maya typically draw their own
manipulator to force sync.

. Which is why (I assume) for moving
 windows the compositor does 100% of the work.


The compositor is doing the window moving so that it can implement
snapping. The client does not have sufficient information to do this.

I sure hope it is not because somebody said Wayland is too slow to do this
any other way. That would be really sad if the developers of Wayland
thought that.

Because if the
 application was involved there would be lagging movement whilst the
 application was switched out or doing something else.


The application is involved already. Window dragging will not start until
the application responds with the drag request.


 It does not work which is why no Window Managers ever do or should do this.


Really? Simple experiments show that Windows does this when dragging a
window (drag a window really fast and spot where it drew the cursor, it
draws it many fewer times than if you move the mouse at the same speed without
dragging, and perfectly locked to the window). This is almost 100% of the
reason people think Windows drags windows more smoothly than X11.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Wayland Relative Pointer API Progress

2015-06-23 Thread Pekka Paalanen
On Mon, 22 Jun 2015 15:46:41 +0900
x414e54 x414...@linux.com wrote:

 This is wrong on so many levels and probably shows you have never
 used Weston or Gnome Shell when they had laggy, slow mouse issues.
 I believe Gnome has partially fixed it, but there was still massive lag
 when entering/exiting some windows, and it changed over the image last
 time I checked.
 This means every so often the mouse got stuck, then flew across the screen.

FWIW, Weston does not block pointer motion on client response. If a
client is lagging, the effect you will see[*] is that the cursor will
keep on moving as normal, but the cursor image will change only when
the client catches up (e.g. from arrow to hand). If the pointer is
already gone when the client catches up, that client cannot change
the cursor anymore (implemented with input serials).

That's the exception we make to the every frame is perfect rule: the
compositor cannot wait for arbitrary clients.

If the client is the one drawing and moving the cursor, the rules
change: the user is at the mercy of the client's responsiveness. Then
the every frame is perfect rule is in full strength: all state updates
the client sends need a way to be applied atomically to avoid the
possibility of transient visual glitches.

Which mode one should pick is a whole other question. Obviously you
can't have the benefits of both without requiring real-time
responsiveness from the client, in which case the choice becomes
mostly irrelevant.


Thanks,
pq

[*] There are other reasons that may stall Weston's output: a heavy GPU
task in a client can slow everything down. Those are a very different
matter and cause.


Re: Wayland Relative Pointer API Progress

2015-06-22 Thread x414e54
Hi, I have been away for a while and quite busy, so I did not get a
chance to respond.

On Tue, Apr 28, 2015 at 3:46 AM, Bill Spitzak spit...@gmail.com wrote:
 No, I absolutely 100% disagree.

 Synchronized updating so things don't vibrate or shift is more important
 than FPS. It is even stated as part of Wayland's basic design criteria:
 every frame is perfect.

Wow... no seriously...

This is wrong on so many levels and probably shows you have never
used Weston or Gnome Shell when they had laggy, slow mouse issues.
I believe Gnome has partially fixed it, but there was still massive lag
when entering/exiting some windows, and it changed over the image last
time I checked.
This means every so often the mouse got stuck, then flew across the screen.

Usually this is what happens when you have people who do not actually
consider usability studies and HFE (human factors engineering).


If you really want to start a quote war, there is also this one:
"by which I mean that applications will be able to control the
rendering enough that we'll never see tearing, lag, redrawing or
flicker."

There is one key word in there: lag. Lag is an inevitable part of
computing; you cannot guarantee when an application will get CPU time
to process events and sync with the compositor.

So in your system the compositor would have to block mouse cursor
movement until an application had dispatched and processed its events.
Literally, in this instance, the user would be moving the physical mouse
with nothing happening on screen. Which is why (I assume) for moving
windows the compositor does 100% of the work. Because if the
application was involved there would be lagging movement whilst the
application was switched out or doing something else.

It does not work, which is why no window managers ever do or should do this.


Re: Wayland Relative Pointer API Progress

2015-04-27 Thread Bill Spitzak



On 04/24/2015 08:51 PM, x414e54 wrote:

On Fri, Apr 24, 2015 at 6:22 PM, Pekka Paalanen ppaala...@gmail.com wrote:

Then apps would need to know what the accel parameters mean, which
requires us to lock down the acceleration logic and mathematical
functions in the protocol specification. I don't think that is
something Wayland wants to specify.


I intended acceleration would just be a bool on/off.


As a practical matter the applications that want a slow scrollbar want
*exactly* the same acceleration as before, just that the resulting position
is scaled down. Forcing them to replicate the code that the compositor is
doing seems mistaken.


Speed would be a scale/percent and does not affect acceleration.
You could call it whatever, movement scale, etc, so this should be
acceptable?


I believe we are both questioning the idea that you use the raw device
api to grab the mouse, as this would apparently bypass the
acceleration code. It sounds like you are proposing keeping the
acceleration result somehow accessible.


A boolean acceleration on/off will not work because it is asynchronous.
Both the accelerated and unaccelerated positions must be available
directly as events (though it is ok if you don't get the unaccelerated
ones until you turn on grab and it then replays any missing ones since
the grabbing event).



On Sat, Apr 25, 2015 at 5:06 AM, Bill Spitzak spit...@gmail.com wrote:

That is not sufficient as the application may want to put much more complex
rules on how the pointer moves (such as limiting it to a rectangle).


Limiting to a specific rectangle is slightly different as you cannot
do that by warping a cursor.


Sure you can. Clamp the x and y to be inside the rectangle and warp the 
cursor to that position.
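The clamp-and-warp idea is a one-liner per axis; a minimal sketch in plain C (the helper names are made up for illustration, this is not any Wayland API):

```c
#include <assert.h>

struct rect { int x0, y0, x1, y1; };   /* inclusive bounds */

static int clampi(int v, int lo, int hi) {
    return v < lo ? lo : (v > hi ? hi : v);
}

/* Clamp an event position into the rectangle; the caller would then
 * warp the cursor image to (*x, *y) while leaving the event stream
 * itself untouched, as the surrounding mail argues. */
static void confine_to_rect(const struct rect *r, int *x, int *y) {
    *x = clampi(*x, r->x0, r->x1);
    *y = clampi(*y, r->y0, r->y1);
}
```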


Possibly you are thinking that the warped cursor position somehow
affects the events? That may explain why the really simple api I am
proposing seems to be confusing people.


What I want when a grab happens is the events continue UNCHANGED. You 
get exactly the same events including xy and relative motion, no matter 
how you move the cursor around. The cursor image is completely unrelated 
to the position being used to produce events.



If you could change the cursor to a drag image including the scrollbar,
then confine the cursor to an area, also applying the speed settings,
this should allow the compositor to do everything.

Something like all in one api call:

wl_pointer::start_drag

serial
 uint - serial of the implicit grab on the origin
surface
 wl_surface - image to drag under the cursor
speed
 fixed_t - movement scale
acceleration
 enum - turn acceleration on off for this drag.
confinement_region
 wl_region - cursor movement region within the draggable surface
(allows null)
drag_region
 wl_region - movement region of the draggable surface (allows null)


That is vastly too complicated. In particular it requires the client to
draw the scrollbar slider using a subwindow, thus making support of this
go deep into the low-level drawing routines of the client, which is
really painful. It also does not account for any kind of intermittent
movement of the scrollbar.



I think you would agree a uniform FPS is more important than in-sync scrollbars?


No, I absolutely 100% disagree.

Synchronized updating so things don't vibrate or shift is more important 
than FPS. It is even stated as part of Wayland's basic design criteria: 
every frame is perfect.



Re: Wayland Relative Pointer API Progress

2015-04-24 Thread Pekka Paalanen
On Fri, 24 Apr 2015 17:59:04 +0900
x414e54 x414...@linux.com wrote:

 If you allowed applications to control the mouse speed or acceleration
 when they have an implicit grab then when it is released it returns to
 normal. This would give GUI applications the ability to use slow
 scrollbars without having to warp the pointer. Absolute pointing
 devices could then just ignore the speed change.

Then apps would need to know what the accel parameters mean, which
requires us to lock down the acceleration logic and mathematical
functions in the protocol specification. I don't think that is
something Wayland wants to specify.

If you have the compositor in charge of moving the cursor, and the app
in charge of scrolling and drawing the contents, you would get a lag
between the cursor moving and the content/scrollbar moving. Granted,
this is probably a minor issue and can happen in other scenarios. We
did manage to avoid this desync with window moves, though.


Thanks,
pq


Re: Wayland Relative Pointer API Progress

2015-04-24 Thread x414e54
If you allowed applications to control the mouse speed or acceleration
when they have an implicit grab then when it is released it returns to
normal. This would give GUI applications the ability to use slow
scrollbars without having to warp the pointer. Absolute pointing
devices could then just ignore the speed change.


Re: Wayland Relative Pointer API Progress

2015-04-24 Thread x414e54
On Fri, Apr 24, 2015 at 6:22 PM, Pekka Paalanen ppaala...@gmail.com wrote:
 Then apps would need to know what the accel parameters mean, which
 requires us to lock down the acceleration logic and mathematical
 functions in the protocol specification. I don't think that is
 something Wayland wants to specify.

I intended acceleration would just be a bool on/off.

 As a practical matter the applications that want a slow scrollbar want
 *exactly* the same acceleration as before, just that the resulting position
 is scaled down. Forcing them to replicate the code that the compositor is
 doing seems mistaken.

Speed would be a scale/percent and does not affect acceleration.
You could call it whatever, movement scale, etc, so this should be
acceptable?

On Sat, Apr 25, 2015 at 5:06 AM, Bill Spitzak spit...@gmail.com wrote:
 That is not sufficient as the application may want to put much more complex
 rules on how the pointer moves (such as limiting it to a rectangle).

Limiting to a specific rectangle is slightly different as you cannot
do that by warping a cursor.

You do need an actual confinement for this.

On Sat, Apr 25, 2015 at 5:10 AM, Bill Spitzak spit...@gmail.com wrote:


 On 04/24/2015 02:22 AM, Pekka Paalanen wrote:

 If you have the compositor in charge of moving the cursor, and the app
 in charge of scrolling and drawing the contents, you would get a lag
 between the cursor moving and the content/scrollbar moving. Granted,
 this is probably a minor issue and can happen in other scenarios. We
 did manage to avoid the unsync with window moves, though.


If you could change the cursor to a drag image including the scrollbar,
then confine the cursor to an area, also applying the speed settings,
this should allow the compositor to do everything.

Something like all in one api call:

wl_pointer::start_drag

serial
uint - serial of the implicit grab on the origin
surface
wl_surface - image to drag under the cursor
speed
fixed_t - movement scale
acceleration
enum - turn acceleration on off for this drag.
confinement_region
wl_region - cursor movement region within the draggable surface
(allows null)
drag_region
wl_region - movement region of the draggable surface (allows null)


Obviously it is a bit over the top but would allow the compositor to
draw the pointer without having to rely on the application's render
and input loops.
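Expressed in Wayland protocol XML, the request sketched above might look like the following. This is purely illustrative: no such request exists in any shipped protocol, and the argument types are my best guess at mapping the informal list.

```xml
<request name="start_drag">
  <description summary="compositor-driven scrollbar drag">
    Hand the drag to the compositor: it moves the surface under the
    cursor, applying the given movement scale and regions, without
    round-tripping to the client on every motion event.
  </description>
  <arg name="serial" type="uint"
       summary="serial of the implicit grab on the origin"/>
  <arg name="surface" type="object" interface="wl_surface"
       summary="image to drag under the cursor"/>
  <arg name="speed" type="fixed"
       summary="movement scale"/>
  <arg name="acceleration" type="uint"
       summary="turn acceleration on/off for this drag (enum)"/>
  <arg name="confinement_region" type="object" interface="wl_region" allow-null="true"
       summary="cursor movement region within the draggable surface"/>
  <arg name="drag_region" type="object" interface="wl_region" allow-null="true"
       summary="movement region of the draggable surface"/>
</request>
```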

 That's a good point: it means applications want to be able to control the
 cursor position during a drag even if they are *not* doing a slow
 scrollbar. This is the only way to make sure the cursor stays in sync with
 the graphic being moved.

It is a trade-off between uniform cursor movement and FPS across all
applications with un-synced scrollbars, or a cursor in sync with
scrollbars but a really slow and jerky software cursor (like in
the GNOME Shell Wayland backend).

I think you would agree a uniform FPS is more important than in-sync scrollbars?


Re: Wayland Relative Pointer API Progress

2015-04-24 Thread Bill Spitzak
That is not sufficient as the application may want to put much more 
complex rules on how the pointer moves (such as limiting it to a rectangle).


As a practical matter the applications that want a slow scrollbar want
*exactly* the same acceleration as before, just that the resulting
position is scaled down. Forcing them to replicate the code that the
compositor is doing seems mistaken.


The only thing that will work is that the application can directly set 
the cursor location while it has the grab.


On 04/24/2015 01:59 AM, x414e54 wrote:

If you allowed applications to control the mouse speed or acceleration
when they have an implicit grab then when it is released it returns to
normal. This would give GUI applications the ability to use slow
scrollbars without having to warp the pointer. Absolute pointing
devices could then just ignore the speed change.




Re: Wayland Relative Pointer API Progress

2015-04-24 Thread Bill Spitzak



On 04/24/2015 02:22 AM, Pekka Paalanen wrote:


If you have the compositor in charge of moving the cursor, and the app
in charge of scrolling and drawing the contents, you would get a lag
between the cursor moving and the content/scrollbar moving. Granted,
this is probably a minor issue and can happen in other scenarios. We
did manage to avoid the unsync with window moves, though.


That's a good point: it means applications want to be able to control 
the cursor position during a drag even if they are *not* doing a slow 
scrollbar. This is the only way to make sure the cursor stays in sync 
with the graphic being moved.



Re: Wayland Relative Pointer API Progress

2015-04-22 Thread x414e54
On Wed, Apr 22, 2015 at 1:52 PM, Peter Hutterer
peter.hutte...@who-t.net wrote:
 The real problem regarding the mouse position is that you now rely on the
 client and the compositor to calculate the cursor position exactly the same
 way. If not, you may end up leaving the window when the cursor as drawn by
 the client is nowhere near an edge (think of in-game menus).

I am not sure it is much different to what we have today:

If they are using it as a cursor for 2D in-game menus or a remote
client, it really should be using the absolute wl_pointer position,
driven by all of the SlavePointers in X11.
e.g. MotionNotify

When you open a device for raw access you intend that you will not be
paying attention to acceleration or normalization so you cannot draw a
cursor synced with the system.
e.g. XI_RawMotion

Currently on X11 I believe you always have to grab the CorePointer?
But it would be much nicer if you could just detach/reattach the
single SlavePointer if the game was going to use it for non-cursor-driven
input.


Re: Wayland Relative Pointer API Progress

2015-04-22 Thread x414e54
On Thu, Apr 23, 2015 at 1:51 PM, x414e54 x414...@linux.com wrote:
 On Wed, Apr 22, 2015 at 1:52 PM, Peter Hutterer
 peter.hutte...@who-t.net wrote:
 The real problem regarding the mouse position is that you now rely on the
 client and the compositor to calculate the cursor position exactly the same
 way. If not, you may end up leaving the window when the cursor as drawn by
 the client is nowhere near an edge (think of in-game menus).

 I am not sure it is much different to what we have today:

 If they are using it for a cursor for 2D in-game menus or a remote
 client it really should be using the absolute wl_pointer position
 driven by all of the SlavePointers in X11.
 e.g. MotionNotify

 When you open a device for raw access you intend that you will not be
 paying attention to acceleration or normalization so you cannot draw a
 cursor synced with the system.
 e.g. XI_RawMotion

 Currently on X11 I believe you have to grab the CorePointer always?
 But it would be much nicer if you could just detach/reattach the
 single SlavePointer if the game was going to use that for non cursor
 driven input.

I guess there is also the situation where you want to constrain the
wl_pointer to the window, such as scrolling a scene based on proximity to
the edge of the window in an RTS.

Some games are nice about this in that they unlock the pointer when
the game is paused, but some just grab and confine the pointer always,
meaning there is no way to reposition the window without doing some
kind of alt-tab dance. Then when you do alt-tab they scroll you over
to the other side of the map because your pointer was outside of the
window.

It is this kind of situation that is probably the main issue.


Re: Wayland Relative Pointer API Progress

2015-04-22 Thread Peter Hutterer
On Tue, Apr 21, 2015 at 06:05:02PM -0700, Bill Spitzak wrote:
 Interesting. It does seem like a good idea to do remote by providing
 identical device api's. This probably applies to sound output too. There
 will have to be simple and obvious methods to figure out the remote machine
 so that all other devices besides the display go to the same one, and there
 will have to be network apis designed for each of them. But this may be a
 way to avoid having every aspect of remote hardware encoded into wayland
 messages.
 
 If a client opens a device, will that interfere with wayland's reading of
 the device? For instance if the client opens the mouse, will wayland still
 get the mouse position such that it can revoke access when the cursor moves
 out of the window?

unless one client issues an EVIOCGRAB on the fd, any other client will
continue to see events (and the event stream is the same on all fds). We use
this for debugging already and it is why things like evemu-record work even
when your graphical session is running.

The real problem regarding the mouse position is that you now rely on the
client and the compositor to calculate the cursor position exactly the same
way. If not, you may end up leaving the window when the cursor as drawn by
the client is nowhere near an edge (think of in-game menus). 

Go back a few years, all the VMs had this issue when they were using
relative input - the solution then was to grab the xorg pointer (i.e.
pointer locking). You can't rely on the relative events alone here.

Cheers,
   Peter

 
 On 04/20/2015 06:14 PM, x414e54 wrote:
 
 2015/04/21 5:42 Bill Spitzak spit...@gmail.com
 mailto:spit...@gmail.com:
  
   On 04/18/2015 03:20 AM, Hans de Goede wrote:
  
   This has been discussed before, and as mentioned before I really
   believe that we should not define a joystick API at all,
   rather we should define an API which will allow an app to:
  
   1) List devices which it can get raw access to
   2) Request raw access, if this succeeds the app will be handled
   an evdev fd for the device
   3) Notify the app that the compositor is revoking access (e.g.
   because the app lost focus), when this happens the compositor
   will do a revoke ioctl on the evdev fd rendering it useless
   to the app, so the app cannot keep using this (so no security
   issue).
   4) Hand a new evdev fd to the app when it regains access.
 


Re: Wayland Relative Pointer API Progress

2015-04-21 Thread x414e54
On Wed, Apr 22, 2015 at 10:05 AM, Bill Spitzak spit...@gmail.com wrote:
 If a client opens a device, will that interfere with wayland's reading of
 the device? For instance if the client opens the mouse, will wayland still
 get the mouse position such that it can revoke access when the cursor moves
 out of the window?

Multiple read access to the same device should work fine.

I think it is the writing that is the main issue: the hardware may have
a setting that the compositor is not aware of (so cannot reset) and
that could confuse the compositor's handling of the device in some way.

But this applies to all devices even on non Wayland systems so
probably should be done at a separate level.

I guess you could advise that all drivers support context switching,
where the entire device state is stored in a context which the
compositor can just switch to/from without caring what it is actually
changing. Then it is just up to the driver vendor or evdev to make
their implementation secure.


Re: Wayland Relative Pointer API Progress

2015-04-21 Thread Peter Hutterer
On Sun, Apr 19, 2015 at 12:45:09PM +0100, Steven Newbury wrote:
 On Sun, 2015-04-19 at 15:29 +0900, x414e54 wrote:
  
  
  The way to do this seems to be for the compositor and client to 
  negotiate an event type they both can understand such as
  libinput_event or hid events and then a way to request a revokable 
  fd to the evdev directly so it can control LEDS and force feedback 
  etc. This allows for applications and compositors to grow separately 
  of the wayland protocol so it does not need updating every time 
  someone invents some new mouse device which needs 128bit integers 
  instead of doubles, has a z axis, thumbstick or tiny projector built 
  in, etc.
  
 
 There's also the point that nothing stops games or sdl-like layers 
 from using libinput to interpret the evdev stream, there's no need to 
 keep re-implementing device handlers for each client, that way new 
 devices supported by Wayland are automaticaly supported.

there is a minor problem: libinput can't open a device based on the fd
alone. there's a clunky workaround in the xorg libinput driver (see the
open_restricted implementation there). 

I'm willing to consider an API that passes an fd to libinput instead of a
path though. Pls file a bug whenever we're at the point where we really need
it, iirc I even had that implemented or at least partially implemented at
some point.

Cheers,
   Peter




Re: Wayland Relative Pointer API Progress

2015-04-21 Thread Bill Spitzak
Interesting. It does seem like a good idea to do remote by providing 
identical device api's. This probably applies to sound output too. There 
will have to be simple and obvious methods to figure out the remote 
machine so that all other devices besides the display go to the same 
one, and there will have to be network apis designed for each of them. 
But this may be a way to avoid having every aspect of remote hardware 
encoded into wayland messages.


If a client opens a device, will that interfere with wayland's reading 
of the device? For instance if the client opens the mouse, will wayland 
still get the mouse position such that it can revoke access when the 
cursor moves out of the window?


On 04/20/2015 06:14 PM, x414e54 wrote:


2015/04/21 5:42 Bill Spitzak spit...@gmail.com
mailto:spit...@gmail.com:
 
  On 04/18/2015 03:20 AM, Hans de Goede wrote:
 
  This has been discussed before, and as mentioned before I really
  believe that we should not define a joystick API at all,
  rather we should define an API which will allow an app to:
 
  1) List devices which it can get raw access to
  2) Request raw access, if this succeeds the app will be handled
  an evdev fd for the device
  3) Notify the app that the compositor is revoking access (e.g.
  because the app lost focus), when this happens the compositor
  will do a revoke ioctl on the evdev fd rendering it useless
  to the app, so the app cannot keep using this (so no security
  issue).
  4) Hand a new evdev fd to the app when it regains access.




Re: Wayland Relative Pointer API Progress

2015-04-20 Thread Bill Spitzak

On 04/18/2015 03:20 AM, Hans de Goede wrote:


This has been discussed before, and as mentioned before I really
believe that we should not define a joystick API at all,
rather we should define an API which will allow an app to:

1) List devices which it can get raw access to
2) Request raw access, if this succeeds the app will be handled
an evdev fd for the device
3) Notify the app that the compositor is revoking access (e.g.
because the app lost focus), when this happens the compositor
will do a revoke ioctl on the evdev fd rendering it useless
to the app, so the app cannot keep using this (so no security
issue).
4) Hand a new evdev fd to the app when it regains access.


This makes sense though I can see some problems if you try to run a game 
on a remote machine, since there is no fd that will work (unless there 
is a network protocol added to communicate all devices, but if that is 
the case why not reuse Wayland's protocol?).


I don't know what other platforms do, but perhaps it is acceptable if 
the number of raw devices can be empty. A client can then either fail 
gracefully, or try to do the best it can with the wl_pointer api. 
Anybody have any examples, is there rdp access to fancy input devices on 
Windows? So when you run your game remotely you are limited to the mouse 
emulation no matter how fancy your controller is.


It is likely that the method used to transform a device into wl_pointer 
positions is controlled by user preferences, so there will need to be a 
design such that the client can see these user preferences and replicate 
them when using the raw device.


For #3 can the compositor somehow force the opened fd to close?


Re: Wayland Relative Pointer API Progress

2015-04-20 Thread x414e54
2015/04/21 5:42 Bill Spitzak spit...@gmail.com:

 On 04/18/2015 03:20 AM, Hans de Goede wrote:

 This has been discussed before, and as mentioned before I really
 believe that we should not define a joystick API at all,
 rather we should define an API which will allow an app to:

 1) List devices which it can get raw access to
 2) Request raw access, if this succeeds the app will be handled
 an evdev fd for the device
 3) Notify the app that the compositor is revoking access (e.g.
 because the app lost focus), when this happens the compositor
 will do a revoke ioctl on the evdev fd rendering it useless
 to the app, so the app cannot keep using this (so no security
 issue).
 4) Hand a new evdev fd to the app when it regains access.


 This makes sense though I can see some problems if you try to run a game
on a remote machine, since there is no fd that will work (unless there is a
network protocol added to communicate all devices, but if that is the case
why not reuse Wayland's protocol?).


You could probably use an fd to a socket to the device on the RDP client
instead, or have the remote desktop server create evdev devices.

Personally I do prefer using events, as it is easy for a client to use
them and also easy for the compositor to re-route or alter some data. But
it makes two-way communication more difficult.

However, I am not convinced about the current way of putting actual types
for raw data in the main protocol, as it is difficult to change later.

Currently libinput uses doubles normalized to 1000 dpi, then Wayland uses
fixed_t, then SDL uses int32. That is a lot of information loss. My current
mouse is 8200 dpi (variable) with a 1000 Hz polling rate. It seems to make
more sense for the raw data just to be an integer dot delta, as it is on
other platforms; the game then deals with this via a sensitivity setting.
You could transmit information about a default sensitivity, but every game
is unique, so that would probably be better via an xdg config file.
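The information loss is easy to see at the first hop: libinput's documented behaviour is to normalize relative motion to what a 1000 dpi device would report. A sketch of that step (the function name is made up for illustration):

```c
#include <assert.h>

/* libinput-style normalization: scale a raw count delta from a device
 * of `dpi` resolution to the equivalent delta on a 1000 dpi device.
 * A high-dpi mouse thus reports fractional values that are truncated
 * again downstream by fixed-point (wl_fixed_t) and int32 consumers. */
static double normalize_to_1000dpi(int raw_delta, int dpi) {
    return (double)raw_delta * 1000.0 / (double)dpi;
}
```

For an 8200 dpi mouse, 82 raw counts become 10.0 normalized units; the sub-count precision survives only as long as every hop keeps the double.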

Trackpads are different, but if used for mouse look they should be able to
be used like a thumbstick where center is home/origin, rather than pushing
relative movement values. Similar to the Steam controller.

 I don't know what other platforms do, but perhaps it is acceptable if the
number of raw devices can be empty. A client can then either fail
gracefully, or try to do the best it can with the wl_pointer api. Anybody
have any examples, is there rdp access to fancy input devices on Windows?
So when you run your game remotely you are limited to the mouse emulation
no matter how fancy your controller is.


I think the assumption would be: if you do not get a raw mouse then there
are no devices which can be used in relative mode, and you should not try.

If you get no raw devices at all then you should either quit or pop up a
message telling the user to connect a device, or the compositor would do
this. Raw devices could also include keyboards, as there are plenty these
days which have full-color LEDs.

On Windows or Mac, without a raw/HID input device, you would warp the
cursor to center after every move event and just hope it was a mouse. It
seems much better to just fail gracefully than resort to hacks when a user
may only have a gamepad plugged in.

As for RDP I think it supports (same as VMs) USB redirection. So the device
just shows up as a normal USB device to the remote windows server.

 It is likely that the method used to transform a device into wl_pointer
positions is controlled by user preferences, so there will need to be a
design such that the client can see these user preferences and replicate
them when using the raw device.


Yes, I would definitely like this, but it could be an xdg game config file
rather than part of the Wayland protocol. Because with raw events you may
still want to use them for something different. Wayland just needs to
provide a way to get the raw evdev data.


Re: Wayland Relative Pointer API Progress

2015-04-20 Thread x414e54
 There is no sense in saying the sensor reading itself as absolute or
 relative. Either gives you some number in unknown units which you
 calibrate to get usable results. You have no idea where the stick is
 from the numbers you get. And there is absolutely no point caring. It
 may have some sense for a particular application and no sense for
 other.

 One of my original points was that a user should be able to hot-swap a
 mouse and a gamepad thumbstick without a game caring, and that games do
 not care about mice/joysticks/touchpads; they just want raw axis values
 that they can use. evdev makes this abstraction.

 But you certainly need to know if the axis is relative or absolute to
 convert it to what the application needs.


Suppose an application wants to move an object around, and I give
you the value 500 and then, 10 seconds later, 400.
Has the object moved 900 units or -100 units? You need to know this;
this is the difference between absolute and relative.

But there is nothing stopping me from giving you the position 500, then
measuring the next value relatively as -100 and reporting the last
position plus the relative distance.
Hence the hot-swapping.
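That accumulation is all the hot-swap requires; a toy sketch in plain C (names hypothetical, not any real API):

```c
#include <assert.h>

/* Present both device kinds behind one absolute-position interface:
 * absolute devices (thumbstick) overwrite the position directly, while
 * relative devices (mouse) have their deltas integrated into it. */
struct axis {
    long pos;   /* last reported absolute position */
};

static long feed_absolute(struct axis *a, long value) {
    a->pos = value;
    return a->pos;
}

static long feed_relative(struct axis *a, long delta) {
    a->pos += delta;
    return a->pos;
}
```

With this, the 500-then-400 ambiguity disappears: an absolute reading of 400 lands at 400, while a relative -100 after an absolute 500 also lands at 400, and the consumer never needs to know which device produced it.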


Re: Wayland Relative Pointer API Progress

2015-04-20 Thread x414e54
This has completely derailed from the whole "include mice in the
game controller protocol" discussion.

On Mon, Apr 20, 2015 at 6:44 PM, Michal Suchanek hramr...@gmail.com wrote:
 On 20 April 2015 at 10:48, Pekka Paalanen ppaala...@gmail.com wrote:
 On Mon, 20 Apr 2015 10:13:34 +0200
 Michal Suchanek hramr...@gmail.com wrote:

 On 20 April 2015 at 09:36, Pekka Paalanen ppaala...@gmail.com wrote:
  On Sun, 19 Apr 2015 09:46:39 +0200
  Michal Suchanek hramr...@gmail.com wrote:
 
  So the device is always absolute and interpretation varies.
 
  I disagree.
 
  Let's take a mouse, optical or ball, doesn't matter. What you get out
  is a position delta over time. This is also know as velocity. Sampling
  rate affects the scale of the values, and you cannot reasonably define
  a closed range for the possible values. There is no home position. All

 There is a home position. That is when you do not move the mouse. The
 reading is then 0.

 That is not a unique position, hence it cannot be a home position. That
 is only a unique velocity. By definition, if your measurement is a
 velocity, it does not directly give you an absolute position.

 When we talk about absolute, we really mean absolute position.

 And what does absolute position of a sensor somewhere outside of the
 PC give you?

 A trackball and touchpad has as absolute position as joystick.

 Trackball measures velocity, touchpad finger position(s), joystick
 stick position.

 None of these is almost ever used for absolute input mapping
 particular reading of a sensor to a particular screen coordinate.


  A mouse could be an absolute device only if you were never able to lift
  it off the table and move it without it generating motion events. This
  is something you cannot do with an absolute device like a joystick.

 You are too much fixed on the construction of the sensor. Mouse is a
 velocity sensor similar to some nunchuck or whatever device with
 reasonable precision accelerometer. That you can and do lift it off
 the table is only relevant to how you use such sensor in practice.

 Accelerometers measure acceleration. Acceleration, like velocity, is
 not a position. It does not give you an absolute position directly.

 And what is practical impact of accelerometers not giving an absolute
 position compared to joystick?

You can warp a relative motion cursor but cannot warp an absolute
position cursor.

Warping a relative motion cursor is still a UX pain because you may be
at the edge of your physical reach, but warping an absolute position
cursor actually introduces an offset and may make the interface unusable.
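
The two warp semantics can be sketched as follows (struct and function names are invented for illustration; this is not compositor code):

```c
/* Hypothetical sketch of the two warp semantics (invented names). */

struct rel_cursor { int x; };       /* position is compositor state  */
struct abs_cursor { int offset; };  /* position is sensor + offset   */

/* Warping a relative-motion cursor just rewrites the stored position. */
void warp_relative(struct rel_cursor *c, int to)
{
    c->x = to;
}

/* "Warping" an absolute cursor can only bias the mapping: every later
 * reading is shifted, which is why it can make the UI unusable. */
void warp_absolute(struct abs_cursor *c, int reading, int to)
{
    c->offset = to - reading;
}

int abs_position(const struct abs_cursor *c, int reading)
{
    return reading + c->offset;
}
```

Note how the absolute warp keeps shifting all subsequent positions, while the relative warp is a one-off.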


 Joystick can stay in an extreme position, mouse cannot. But if you
 take a nunchuck attached to a string and rotate it above your head the
 reading stays in an extreme position all the same.

 There is no sense in saying the sensor reading itself as absolute or
 relative. Either gives you some number in unknown units which you
 calibrate to get usable results. You have no idea where the stick is
 from the numbers you get. And there is absolutely no point caring. It
 may have some sense for a particular application and no sense for
 other.

One of my original points was that a user should be able to hot-swap a
mouse and a gamepad thumbstick without the game caring; games do not
care about mice/joysticks/touchpads, they just want raw axis values
they can use, and evdev provides this abstraction.

But you certainly need to know whether an axis is relative or absolute
to convert it to what the application needs.


 Thanks

 Michal


Re: Wayland Relative Pointer API Progress

2015-04-20 Thread Michal Suchanek
On 20 April 2015 at 13:44, x414e54 x414...@linux.com wrote:
 This is kinda completely derailed from the whole include mice in the
 game controller protocol talk.

 On Mon, Apr 20, 2015 at 6:44 PM, Michal Suchanek hramr...@gmail.com wrote:
 On 20 April 2015 at 10:48, Pekka Paalanen ppaala...@gmail.com wrote:
 On Mon, 20 Apr 2015 10:13:34 +0200
 Michal Suchanek hramr...@gmail.com wrote:

 On 20 April 2015 at 09:36, Pekka Paalanen ppaala...@gmail.com wrote:
  On Sun, 19 Apr 2015 09:46:39 +0200
  Michal Suchanek hramr...@gmail.com wrote:
 
  So the device is always absolute and interpretation varies.
 
  I disagree.
 
  Let's take a mouse, optical or ball, doesn't matter. What you get out
  is a position delta over time. This is also know as velocity. Sampling
  rate affects the scale of the values, and you cannot reasonably define
  a closed range for the possible values. There is no home position. All

 There is a home position. That is when you do not move the mouse. The
 reading is then 0.

 That is not a unique position, hence it cannot be a home position. That
 is only a unique velocity. By definition, if your measurement is a
 velocity, it does not directly give you an absolute position.

 When we talk about absolute, we really mean absolute position.

 And what does absolute position of a sensor somewhere outside of the
 PC give you?

 A trackball and touchpad has as absolute position as joystick.

 Trackball measures velocity, touchpad finger position(s), joystick
 stick position.

 None of these is almost ever used for absolute input mapping
 particular reading of a sensor to a particular screen coordinate.


  A mouse could be an absolute device only if you were never able to lift
  it off the table and move it without it generating motion events. This
  is something you cannot do with an absolute device like a joystick.

 You are too much fixed on the construction of the sensor. Mouse is a
 velocity sensor similar to some nunchuck or whatever device with
 reasonable precision accelerometer. That you can and do lift it off
 the table is only relevant to how you use such sensor in practice.

 Accelerometers measure acceleration. Acceleration, like velocity, is
 not a position. It does not give you an absolute position directly.

 And what is practical impact of accelerometers not giving an absolute
 position compared to joystick?

 You can warp a relative motion cursor but cannot warp an absolute
 position cursor.

Indeed. But that's a property of how the compositor uses the sensor
data to move the cursor, not of the sensor itself.

If a joystick were ever used to position the cursor, it would most
likely be done in relative mode, although you keep repeating that a
joystick is 'absolute'. There is no practical mapping of raw stick
eccentricity to absolute screen coordinates.


 Warping a relative motion cursor is still a UX pain because you may be
 at the edge of your physical reach but warping an absolute position
 cursor is actually an offset and may make the interface unusable.

Warping a cursor that is operated by an input device in absolute
mode is entirely possible. However, unless the cursor is also
confined, it will likely warp back on the next input event. When the
cursor is confined, you effectively end up with sensor readings whose
mapping to absolute screen coordinates would put the cursor outside
the (active) screen area. You can also implement the confinement by
changing the mapping - not necessarily only the offset.
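
A minimal sketch of confinement implemented by changing the mapping, under the assumption of a simple linear scale-plus-offset mapping (all names invented for illustration):

```c
/* Clamp a value into [lo, hi]. */
int clamp(int v, int lo, int hi)
{
    return v < lo ? lo : (v > hi ? hi : v);
}

/* Confinement by changing the mapping: readings whose linear mapping
 * would land outside the active area are clamped instead of letting
 * the cursor escape. */
int confine_mapped(int reading, int scale, int offset,
                   int min_x, int max_x)
{
    return clamp(reading * scale + offset, min_x, max_x);
}
```

A reading mapping to 1000 on an 800-wide area stays pinned at 800 until the readings come back into range.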



 Joystick can stay in an extreme position, mouse cannot. But if you
 take a nunchuck attached to a string and rotate it above your head the
 reading stays in an extreme position all the same.

 There is no sense in saying the sensor reading itself as absolute or
 relative. Either gives you some number in unknown units which you
 calibrate to get usable results. You have no idea where the stick is
 from the numbers you get. And there is absolutely no point caring. It
 may have some sense for a particular application and no sense for
 other.

 One of my original points was that a user should be able to hot-swap a
 mouse and a gamepad thumbstick without a game caring and that games do
 not care about mice/joystick/touchpad they just want raw axis values
 that they can use, evdev makes this abstraction.

 But you certainly need to know if the axis is relative or absolute to
 convert it to what the application needs.

And my point is that there is no such thing as a relative or absolute
axis. There are sensors that give numbers as readings. Sometimes you
know that a bunch of sensors are actually axes which are physically
connected and orthogonal on a device, which is nice.

It might be worthwhile to provide adapting filters that try to mimic
the dynamic input properties of one type of device using another type
of device. However, that is n×n filters for n kinds of devices, and n
is certainly more than 2.

Thanks

Michal

Re: Wayland Relative Pointer API Progress

2015-04-20 Thread Michal Suchanek
On 20 April 2015 at 14:49, x414e54 x414...@linux.com wrote:
 There is no sense in saying the sensor reading itself as absolute or
 relative. Either gives you some number in unknown units which you
 calibrate to get usable results. You have no idea where the stick is
 from the numbers you get. And there is absolutely no point caring. It
 may have some sense for a particular application and no sense for
 other.

 One of my original points was that a user should be able to hot-swap a
 mouse and a gamepad thumbstick without a game caring and that games do
 not care about mice/joystick/touchpad they just want raw axis values
 that they can use, evdev makes this abstraction.

 But you certainly need to know if the axis is relative or absolute to
 convert it to what the application needs.


 If you had an application wanting to move an object around and I gave
 you the value 500 and then 10 seconds later 400.
 Has the object moved 900 units or -100 units? You need to know this,
 this is the difference between absolute and relative.

Actually, that's determined by the application. It can use any sensor
in a relative or absolute mapping as it sees fit. And since we are
talking about replacing a controller like a joystick, which is
typically used in relative mode, with a mouse, which is typically also
used in relative mode, or with a tablet or touch layer, which can also
be used in relative mode (e.g. when scrolling), there is no real
problem to solve here. If you were really diligent you could adapt the
touch layer by adding an offset so the center of the touch area reads as 0.
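
That re-centering trick can be sketched in one line of C (illustrative helper with an invented name; real touch ranges would come from the device):

```c
/* Re-center a touch axis so its midpoint reads as 0, turning an
 * absolute finger position into a stick-like signed deflection. */
int recenter(int reading, int min_v, int max_v)
{
    return reading - (min_v + max_v) / 2;
}
```

On a 0..1024 axis, a touch at 512 reads 0, and the edges read ±512, just like a centered stick.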


 But there is nothing stopping me giving you the position 500 and then
 measuring the next value relatively -100 and then calculating the last
 position plus the relative distance and giving this value to you.
 Hence the hot-swapping.

There is no relative distance anywhere, ever. Unless you make one up.

The application is not interested in something you make up but in
sensor readings.

If it is interested in data you make up it can use the pointer position.

Thanks

Michal


Re: Wayland Relative Pointer API Progress

2015-04-20 Thread Pekka Paalanen
On Mon, 20 Apr 2015 10:13:34 +0200
Michal Suchanek hramr...@gmail.com wrote:

 On 20 April 2015 at 09:36, Pekka Paalanen ppaala...@gmail.com wrote:
  On Sun, 19 Apr 2015 09:46:39 +0200
  Michal Suchanek hramr...@gmail.com wrote:
 
  So the device is always absolute and interpretation varies.
 
  I disagree.
 
  Let's take a mouse, optical or ball, doesn't matter. What you get out
  is a position delta over time. This is also know as velocity. Sampling
  rate affects the scale of the values, and you cannot reasonably define
  a closed range for the possible values. There is no home position. All
 
 There is a home position. That is when you do not move the mouse. The
 reading is then 0.

That is not a unique position, hence it cannot be a home position. That
is only a unique velocity. By definition, if your measurement is a
velocity, it does not directly give you an absolute position.

When we talk about absolute, we really mean absolute position.

  A mouse could be an absolute device only if you were never able to lift
  it off the table and move it without it generating motion events. This
  is something you cannot do with an absolute device like a joystick.
 
 You are too much fixed on the construction of the sensor. Mouse is a
 velocity sensor similar to some nunchuck or whatever device with
 reasonable precision accelerometer. That you can and do lift it off
 the table is only relevant to how you use such sensor in practice.

Accelerometers measure acceleration. Acceleration, like velocity, is
not a position. It does not give you an absolute position directly.


Thanks,
pq


Re: Wayland Relative Pointer API Progress

2015-04-20 Thread Pekka Paalanen
  On 18 April 2015 at 16:58, x414e54 x414...@linux.com wrote:
 
  USB HID specifications define a pointer and a mouse as two completely
  different inputs. A mouse can be a used as a pointer because it is
  pushing the cursor around but the pointer points at a specific
  location.

Okay. Using different definitions for terms from different places and
interpreting the terms used by other people with your own different
definitions is obviously going to cause disagreement.

I explained what a wl_pointer in Wayland terms is in another email.
Sounds like it is specifically not a HID pointer device.


On Sun, 19 Apr 2015 09:46:39 +0200
Michal Suchanek hramr...@gmail.com wrote:

 So the device is always absolute and interpretation varies.

I disagree.

Let's take a mouse, optical or ball, it doesn't matter. What you get out
is a position delta over time. This is also known as velocity. Sampling
rate affects the scale of the values, and you cannot reasonably define
a closed range for the possible values. There is no home position. All
this reads to me as relative. The home position is the important
thing, and so is the fact that the home position is observable by the
human user.

Take a joystick. The stick has a home position, the center. You can
only tilt the stick up to its hardware limits. Those limits are well
established and obvious to the human using the stick, without knowing
anything more than what the device looks like. The measurements you get
tell you the position of the stick. Sampling rate does not affect the
readings, and they are not related to time. Therefore the readings are
not velocity but position. This is what I would call absolute.

Yes, the trackpoint has been raised here before, and it seems much
closer to a joystick than a traditional mouse. That's ok, you probably
could use it as a joystick, since it does have a home position that is
obvious to a human user. Like you said, for trackpoints the absolute
measurement is only interpreted as a velocity through some
non-decreasing function.

A mouse could be an absolute device only if you were never able to lift
it off the table and move it without it generating motion events. This
is something you cannot do with an absolute device like a joystick.

 You are trying to make a distinction that is only relevant to use of
 the device readings for generating pointer motion events but otherwise
 does not exist.

Converting one input device to emulate another (trackpoint → mouse,
touchpad → mouse, keyboard → mouse, mouse → keyboard, mouse →
joystick) is one thing. I don't think that is on topic here.

A mouse is inherently a relative input device. What we're discussing
here is exposing the relative measurements to apps, rather than the
absolute position that the compositor manufactures by integrating over
the relative measurements.
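
The absolute position the compositor "manufactures" is just an integral of the deltas, clamped to the screen. A minimal sketch of that accumulation (invented names, not actual compositor code):

```c
/* Compositor-side pointer state: an absolute position manufactured by
 * integrating relative mouse deltas and clamping to the screen. */
struct pointer { int x, y; };

int clamp_axis(int v, int max)
{
    return v < 0 ? 0 : (v > max ? max : v);
}

void integrate_motion(struct pointer *p, int dx, int dy,
                      int screen_w, int screen_h)
{
    p->x = clamp_axis(p->x + dx, screen_w - 1);
    p->y = clamp_axis(p->y + dy, screen_h - 1);
}
```

The relative-pointer proposal is about delivering `dx`/`dy` to the app directly, before this accumulation and clamping happens.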


Thanks,
pq


Re: Wayland Relative Pointer API Progress

2015-04-20 Thread x414e54
On Mon, Apr 20, 2015 at 4:36 PM, Pekka Paalanen ppaala...@gmail.com wrote:
  On 18 April 2015 at 16:58, x414e54 x414...@linux.com wrote:

  USB HID specifications define a pointer and a mouse as two completely
  different inputs. A mouse can be a used as a pointer because it is
  pushing the cursor around but the pointer points at a specific
  location.

 Okay. Using different definitions for terms from different places and
 interpreting the terms used by other people with your own different
 definitions is obviously going to cause disagreement.

 I explained what a wl_pointer in Wayland terms is in another email.
 Sounds like it is specifically not a HID pointer device.


Yes, this is fair enough.

I read through the wl_gamepad discussion from before, and it mostly
seems to be along the same lines as I was thinking.

For example, take a PS4 controller, which has a built-in
trackpad/touchpad. Unless the game specifically wants to use the
trackpad, I would want to be able to use it as the wl_pointer to
interact with other windows without the game losing gamepad focus.

I was also thinking of crazy things like two mice per seat, for something
like multiplayer Surgeon Simulator with two hands, where each user needs
to assign one mouse to the left hand and one to the right.


Re: Wayland Relative Pointer API Progress

2015-04-20 Thread Michal Suchanek
On 20 April 2015 at 10:48, Pekka Paalanen ppaala...@gmail.com wrote:
 On Mon, 20 Apr 2015 10:13:34 +0200
 Michal Suchanek hramr...@gmail.com wrote:

 On 20 April 2015 at 09:36, Pekka Paalanen ppaala...@gmail.com wrote:
  On Sun, 19 Apr 2015 09:46:39 +0200
  Michal Suchanek hramr...@gmail.com wrote:
 
  So the device is always absolute and interpretation varies.
 
  I disagree.
 
  Let's take a mouse, optical or ball, doesn't matter. What you get out
  is a position delta over time. This is also know as velocity. Sampling
  rate affects the scale of the values, and you cannot reasonably define
  a closed range for the possible values. There is no home position. All

 There is a home position. That is when you do not move the mouse. The
 reading is then 0.

 That is not a unique position, hence it cannot be a home position. That
 is only a unique velocity. By definition, if your measurement is a
 velocity, it does not directly give you an absolute position.

 When we talk about absolute, we really mean absolute position.

And what does the absolute position of a sensor somewhere outside the
PC give you?

A trackball or touchpad has as absolute a position as a joystick.

A trackball measures velocity, a touchpad measures finger position(s),
a joystick measures stick position.

Almost none of these is ever used for absolute input, i.e. mapping a
particular reading of a sensor to a particular screen coordinate.


  A mouse could be an absolute device only if you were never able to lift
  it off the table and move it without it generating motion events. This
  is something you cannot do with an absolute device like a joystick.

 You are too much fixed on the construction of the sensor. Mouse is a
 velocity sensor similar to some nunchuck or whatever device with
 reasonable precision accelerometer. That you can and do lift it off
 the table is only relevant to how you use such sensor in practice.

 Accelerometers measure acceleration. Acceleration, like velocity, is
 not a position. It does not give you an absolute position directly.

And what is the practical impact of accelerometers not giving an
absolute position, compared to a joystick?

A joystick can stay in an extreme position; a mouse cannot. But if you
take a nunchuck attached to a string and rotate it above your head, the
reading stays in an extreme position all the same.

There is no sense in labelling the sensor reading itself as absolute or
relative. Either way you get some number in unknown units which you
calibrate to get usable results. You have no idea where the stick is
from the numbers you get, and there is absolutely no point in caring. It
may make sense for one particular application and no sense for another.

Thanks

Michal


Re: Wayland Relative Pointer API Progress

2015-04-20 Thread Michal Suchanek
On 20 April 2015 at 09:36, Pekka Paalanen ppaala...@gmail.com wrote:
  On 18 April 2015 at 16:58, x414e54 x414...@linux.com wrote:

  USB HID specifications define a pointer and a mouse as two completely
  different inputs. A mouse can be a used as a pointer because it is
  pushing the cursor around but the pointer points at a specific
  location.

 Okay. Using different definitions for terms from different places and
 interpreting the terms used by other people with your own different
 definitions is obviously going to cause disagreement.

 I explained what a wl_pointer in Wayland terms is in another email.
 Sounds like it is specifically not a HID pointer device.


 On Sun, 19 Apr 2015 09:46:39 +0200
 Michal Suchanek hramr...@gmail.com wrote:

 So the device is always absolute and interpretation varies.

 I disagree.

 Let's take a mouse, optical or ball, doesn't matter. What you get out
 is a position delta over time. This is also know as velocity. Sampling
 rate affects the scale of the values, and you cannot reasonably define
 a closed range for the possible values. There is no home position. All

There is a home position. That is when you do not move the mouse. The
reading is then 0.

And there is a range. The construction of the mouse sensor defines the
maximum speed measurable in hardware. Although you do not find this
speed in mouse specifications, on many low-end mice this threshold is
actually reachable.

 this reads to me as relative. The home position is the important
 thing, and that the home position is observable by the human user.

Indeed, and both a joystick and a mouse have a home position.


 Take a joystick. The stick has a home position, the center. You can
 only tilt the stick up to it's hardware limits. Those limits are well
 established and obvious to the human using the stick without knowing
 anything else than just looking at the device. The measurements you get
 tell you the position of the stick. Sampling rate does not affect the
 readings, and they are not related to time. Therefore the readings are
 not velocity but position. This is what I would call absolute.

Sampling rate does not actually affect measured speed either, so long
as what you measure is speed. It affects the distance measured within
the sampling period, so you have to take the sampling period into
account when determining speed. And since the sampling period is
typically fixed for a mouse, what you get is a sensor reading which is
directly comparable with any other reading from the same sensor. It's
the distance the mouse moved in the sampling interval, or the mouse
movement speed in some unspecified units.
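
In other words, per-sample counts only become a speed once divided by the sampling period; a one-line sketch with illustrative units:

```c
/* Per-sample motion counts become a speed (counts per second) only
 * once divided by the sampling period. With a fixed period the raw
 * counts are directly comparable with each other, as argued above. */
double speed_from_counts(int counts_per_sample, double period_s)
{
    return counts_per_sample / period_s;
}
```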


 Yes, the trackpoint has been raised here before, and it seems much
 closer to a joystick than a traditional mouse. That's ok, you probably
 could use it as a joystick, since it does have a home position that is
 obvious to a human user. Like you said, for trackpoints the absolute
 measurement is only interpreted as a velocity through some
 non-decreasing function.

The practical difference between a mouse and a joystick is that you can
move the stick to an extreme position and hold it there, which is not
possible with a mouse. That's why a trackpoint is a joystick, unless
the hardware cooks the stick data in a very weird way.


 A mouse could be an absolute device only if you were never able to lift
 it off the table and move it without it generating motion events. This
 is something you cannot do with an absolute device like a joystick.

You are too fixated on the construction of the sensor. A mouse is a
velocity sensor, similar to a nunchuck or any other device with a
reasonably precise accelerometer. That you can and do lift it off
the table is only relevant to how you use such a sensor in practice.

Or is, by your definition of relative, a trackball bolted to the table
an absolute input device because you cannot lift it?


 You are trying to make a distinction that is only relevant to use of
 the device readings for generating pointer motion events but otherwise
 does not exist.

 Converting one input device to emulate another (trackpoint → mouse,
 touchpad → mouse, keyboard → mouse, mouse → keyboard, mouse →
 joystick) is one thing. I don't think that is on topic here.

 A mouse is inherently a relative input device. What we're discussing
 here is exposing the relative measurements to apps, rather than the
 absolute position that the compositor manufactures by integrating over
 the relative measurements.

But that's confusing things. A mouse is as absolute as a joystick.
Compositor input handling is all about converting absolute sensor data
into relative pointer movement, because for most sensors the sensor
range cannot practically be mapped 1-to-1 to absolute screen
coordinates.

What the programs that eschew this conversion want is access to sensor
readings that are as raw and unconverted as possible, so that they can
convert them to other input such as scene rotation. And they want to
convert the sensor reading to relative or absolute 

Re: Wayland Relative Pointer API Progress

2015-04-19 Thread x414e54
From the top, just to work our way through this treacle, what a game wants is:

1. Enumerate input devices.
2. Select the HID type each player wants and get a device that
supports that usage.

e.g. mouse-type devices (could be a thumb-stick, pointing stick, mouse,
accelerometer, trackpad, or pen, as long as it provides relative motion
across two axes and has at least two buttons), gamepads, HMDs, 6DOF
pointers, etc.

3. For each player, assign the input device, either based on one input
per wl_seat or with the game assigning the enumerated devices from one
wl_seat.
4. If the device controls the system cursor (wl_pointer): hide, lock,
detach, or constrain it.
5. Receive raw unaccelerated data from those devices (including mouse wheels).
6. Be able to open the device directly and communicate back to that
same device, to provide force feedback, change LED colors, etc. per
player.


If the game wants to use any available trackpad as an absolute pointer
device it should be able to, but it should also be able to use it as a
normal mouse input separate from the wl_pointer.


The way to do this seems to be for the compositor and client to
negotiate an event type they both understand, such as libinput_event
or HID events, and then a way to request a revokable fd to the evdev
device directly so the client can control LEDs and force feedback, etc.
This allows applications and compositors to evolve separately from the
Wayland protocol, so it does not need updating every time someone
invents some new mouse device that needs 128-bit integers instead of
doubles, or has a z axis, thumbstick, or tiny projector built in.
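
A sketch of the consuming side of such a revokable fd, using the standard Linux evdev interface (`struct input_event` from `linux/input.h`). The fd negotiation itself is the hypothetical part a protocol would have to define; the per-event relative/absolute distinction is already there in the event type:

```c
#include <linux/input.h>   /* struct input_event, EV_REL, EV_ABS */
#include <stdio.h>
#include <unistd.h>

/* Decide per event whether the value is a delta or a position. */
const char *classify(unsigned short type)
{
    switch (type) {
    case EV_REL: return "relative";   /* e.g. REL_X from a mouse    */
    case EV_ABS: return "absolute";   /* e.g. ABS_X from a joystick */
    default:     return "other";      /* buttons, syn events, ...   */
    }
}

/* Read raw events from an evdev fd handed over by the compositor
 * (the handover mechanism is the hypothetical part). */
void read_loop(int evdev_fd)
{
    struct input_event ev;

    while (read(evdev_fd, &ev, sizeof(ev)) == sizeof(ev))
        printf("%s axis %u value %d\n",
               classify(ev.type), ev.code, ev.value);
}
```

This is exactly the "device reports a value, interpretation varies" view: the same loop serves a mouse and a stick, and only the EV_REL/EV_ABS tag differs.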


This is all completely different from what GUI input wants to do,
which is to temporarily receive continuous motion from the wl_pointer,
unconstrained by the window or output size, while dragging some widget
or slider.


Thank you for listening, I hope I have given you a good insight into
the way games work.
But I do not really have any more time to spend on this.


Re: Wayland Relative Pointer API Progress

2015-04-19 Thread Steven Newbury
On Sun, 2015-04-19 at 15:29 +0900, x414e54 wrote:
 
 
 The way todo this seems to be for the compositor and client to 
 negotiate an event type they both can understand such as
 libinput_event or hid events and then a way to request a revokable 
 fd to the evdev directly so it can control LEDS and force feedback 
 etc. This allows for applications and compositors to grow separately 
 of the wayland protocol so it does not need updating every time 
 someone invents some new mouse device which needs 128bit integers 
 instead of doubles, has a z axis, thumbstick or tiny projector built 
 in, etc.
 

There's also the point that nothing stops games or SDL-like layers
from using libinput to interpret the evdev stream; there's no need to
keep re-implementing device handlers for each client, and that way new
devices supported by Wayland are automatically supported.




Re: Wayland Relative Pointer API Progress

2015-04-19 Thread Michal Suchanek
On 19 April 2015 at 06:15, x414e54 x414...@linux.com wrote:
 On Sun, Apr 19, 2015 at 12:45 AM, Michal Suchanek hramr...@gmail.com wrote:
 On 18 April 2015 at 16:58, x414e54 x414...@linux.com wrote:



 A joystick does not necessarily have 2 axis and in most cases yes they
 are reporting an absolute position of the axis in the driver but it
 does not necessarily mean that the the hardware is absolute. For
 example if a joystick is using the same slotted optical system as the
 ball mice then this would be measuring relative motion and using it to
 calculate the absolute axis value, under some circumstances the
 position could become out of sync until re-calibrated by moving to the
 joystick to the maximum and minimum values for all axes or having it
 automatically re-center.


 USB HID specifications define a pointer and a mouse as two completely
 different inputs. A mouse can be a used as a pointer because it is
 pushing the cursor around but the pointer points at a specific
 location.

 And there is no practical way to point with a mouse to a specific
 location. Nor is there for most joysticks because they are not precise
 enough but you technically could map the stick excentricity to screen
 coordinates. Similarly a small touchpad has no practical mapping of
 touch coordinates to screen coordinates but for a big graphics tablet
 or touchscreen surface this can be done.

 The device is never relative, only interpretation of the obtained data is.

 Thanks

 Michal

 Rather than waste time on this I will just direct you over the the
 universal teacher Google.

 relative device is probably the best search term.

yes, and all the articles I found

like https://en.wikipedia.org/wiki/Input_device
https://msdn.microsoft.com/en-us/library/windows/desktop/ee418779%28v=vs.85%29.aspx

boil down to the fact that the device reports some value, which can
either be interpreted as a relative pointer position increment or
reported as the absolute value read from the sensor.

So the device is always absolute and interpretation varies.

You are trying to make a distinction that is only relevant to use of
the device readings for generating pointer motion events but otherwise
does not exist.

Thanks

Michal


Re: Wayland Relative Pointer API Progress

2015-04-18 Thread Michal Suchanek
On 17 April 2015 at 12:52, Hans de Goede hdego...@redhat.com wrote:
 Hi,


 On 17-04-15 11:47, Michal Suchanek wrote:

 On 17 April 2015 at 09:11, Pekka Paalanen ppaala...@gmail.com wrote:

 On Fri, 17 Apr 2015 13:43:11 +0900
 x414e54 x414...@linux.com wrote:

 Thank you for the comments.
 I do have a few counterpoints but I will leave after that.


 Not sure an IR/laser/wii mote pointer should even be considered a
 relative pointer since they operate in absolute coordinates. Given
 this, there is no set position hint to consider. Transmitting
 accelerometer data via a relative pointer doesn't sound reasonable.


 I think this is the issue right here. Pointers are not relative, mice
 are not pointers.


 What definition of a pointer are you using?

 The definition Wayland uses for a wl_pointer is a device that:
 - requires a cursor image on screen to be usable
 - the physical input is relative, not absolute

 This definition is inspired by mice, and mice have been called pointer
 devices, so we picked the well-known name pointer for mice-like
 devices.

 Specifically, a pointer is *not* a device where you directly point a
 location on screen, like a touchscreen for example. For touchscreens,
 there is a separate protocol wl_touch.

 For drawing tablets, there will be yet another protocol.

 Joysticks or gamepads fit into none of the above. For the rest of the
 conversation, you should probably look up the long gamepad protocol
 discussions from the wayland-devel mailing list archives.


 And how is a joystick different from a trackpoint, exactly?

 It uses a different hardware interface and later a different software
 interface, but for no good reason. It's just a two-axis relative input
 device with buttons. Sure, the big joystick, gamepad directional cap
 and trackpoint are at different places on the stick-size scale and
 might have different hardware sensors, which should be reflected in
 different acceleration settings, but ultimately it's the same kind of
 device.


 Actually joystick analog inputs are absolute, not relative. They give a value
 for exactly how much the stick has moved from the center.

 Except for d-pads, which are really buttons, not relative axes, so joysticks
 really are pretty much not like trackpoints in any way.


Hi,

then actually mice are absolute not relative. They have two axes that
measure absolute ball rotation speed in two directions, just like a
joystick has two axes that measure absolute stick eccentricity.

Thanks

Michal


Re: Wayland Relative Pointer API Progress

2015-04-18 Thread x414e54
 Right.

 This has been discussed before, and as mentioned before I really
 believe that we should not define a joystick API at all,
 rather we should define an API which will allow an app to:

 1) List devices which it can get raw access to
 2) Request raw access, if this succeeds the app will be handed
 an evdev fd for the device
 3) Notify the app that the compositor is revoking access (e.g.
 because the app lost focus), when this happens the compositor
 will do a revoke ioctl on the evdev fd rendering it useless
 to the app, so the app cannot keep using this (so no security
 issue).
 4) Hand a new evdev fd to the app when it regains access.

 This will allow whatever gaming crazy devices pop up to work,
 and will allow force feedback to work too.

 I think that trying to define a wayland controller protocol for
 this is not a good idea, we will just end up wrapping evdev and
 adding a whole lot of unnecessary indirection. Currently
 games (e.g. libSDL) are already used to opening raw evdev nodes,
 by moving the enumeration + actual opening into the compositor
 we are fixing the security concerns of the old approach while
 keeping all the good bits of it (mainly giving games raw
 access to complex input devices), and this also solves the
 seat assignment problem neatly.

 Regards,

 Hans

Yes I agree this is the best option and it should be used to implement
mouse support in games. They get the fd and can get all of the
unaccelerated raw relative information they need. There are also other
cases, like mice with RGB LEDs whose color the game will want to
change. This also allows support for multiple pointers in
games which do not all drive the system pointer.

The pointer focus still needs to be constrained, or the device unhooked
from the system pointer, but only in the case that the device opened is
the same as the one driving the system pointer and the application asks
for it to be hidden. I think in most cases in a game, if the pointer is
visible it should be allowed to escape the window.


Re: Wayland Relative Pointer API Progress

2015-04-18 Thread Michal Suchanek
On 18 April 2015 at 16:58, x414e54 x414...@linux.com wrote:
 Hi,

 then actually mice are absolute not relative. They have two axes that
 measure absolute ball rotation speed in two directions, just like a
 joystick has two axes that measure absolute stick eccentricity.

 Thanks

 Michal

 This is not really constructive to the api but:

 Mice are not absolute because they are just measuring movement of a
 surface relative to itself, when you are not moving the mouse there is
 no axis value.

There is: the value is 0. Which is the same as with a properly
calibrated joystick - when you release it, it returns to the position
where the reading of both axes is 0.

And that is the interface expected most of the time when using a
grabbed mouse with a hidden cursor.

There is even a parallel to removing springs from a joystick in the
mouse world - there were Russian trackballs with a really big metal ball
which would keep spinning due to its momentum until stopped.

You could take the ball out and turn it over and put it
 back in; the absolute position of the ball has changed but the mouse
 axes have not. For some ball mice the rollers measure the movement of a

That's because it measures the speed of the ball, not its position. When
you roll the ball outside of the mouse it cannot measure it.

 wheel with small holes inside it, when it moves it breaks the
 connection the chip registers this and uses it to calculate the delta
 for that axis. Optical mice are just taking small images of the
 surface and using that information to calculate a distance moved,
 again relative motion.

However, the reported value is the absolute speed of the mouse as measured
against the surface. There is no more relativity than in the measurement
of the eccentricity of a stick relative to a central position.


 A joystick does not necessarily have two axes, and in most cases yes they
 are reporting an absolute position of the axis in the driver, but it
 does not necessarily mean that the hardware is absolute. For
 example, if a joystick used the same slotted optical system as the
 ball mice then it would be measuring relative motion and using it to
 calculate the absolute axis value; under some circumstances the
 position could become out of sync until re-calibrated by moving the
 joystick to the maximum and minimum values for all axes or having it
 automatically re-center.


 USB HID specifications define a pointer and a mouse as two completely
 different inputs. A mouse can be used as a pointer because it is
 pushing the cursor around but the pointer points at a specific
 location.

And there is no practical way to point with a mouse at a specific
location. Nor is there for most joysticks, because they are not precise
enough, but you technically could map the stick eccentricity to screen
coordinates. Similarly, a small touchpad has no practical mapping of
touch coordinates to screen coordinates, but for a big graphics tablet
or touchscreen surface this can be done.

The device is never relative; only the interpretation of the obtained data is.

Thanks

Michal


Re: Wayland Relative Pointer API Progress

2015-04-18 Thread x414e54
 Hi,

 then actually mice are absolute not relative. They have two axes that
 measure absolute ball rotation speed in two directions, just like a
 joystick has two axes that measure absolute stick eccentricity.

 Thanks

 Michal

This is not really constructive to the api but:

Mice are not absolute because they are just measuring the movement of a
surface relative to themselves; when you are not moving the mouse there is
no axis value. You could take the ball out and turn it over and put it
back in; the absolute position of the ball has changed but the mouse
axes have not. For some ball mice the rollers measure the movement of a
wheel with small holes inside it; when it moves it breaks the
connection, the chip registers this and uses it to calculate the delta
for that axis. Optical mice just take small images of the
surface and use that information to calculate a distance moved,
again relative motion.

A joystick does not necessarily have two axes, and in most cases yes they
are reporting an absolute position of the axis in the driver, but it
does not necessarily mean that the hardware is absolute. For
example, if a joystick used the same slotted optical system as the
ball mice then it would be measuring relative motion and using it to
calculate the absolute axis value; under some circumstances the
position could become out of sync until re-calibrated by moving the
joystick to the maximum and minimum values for all axes or having it
automatically re-center.


USB HID specifications define a pointer and a mouse as two completely
different inputs. A mouse can be used as a pointer because it is
pushing the cursor around but the pointer points at a specific
location.


Re: Wayland Relative Pointer API Progress

2015-04-18 Thread Hans de Goede

Hi,

On 18-04-15 16:03, x414e54 wrote:

Right.

This has been discussed before, and as mentioned before I really
believe that we should not define a joystick API at all,
rather we should define an API which will allow an app to:

1) List devices which it can get raw access to
2) Request raw access, if this succeeds the app will be handed
an evdev fd for the device
3) Notify the app that the compositor is revoking access (e.g.
because the app lost focus), when this happens the compositor
will do a revoke ioctl on the evdev fd rendering it useless
to the app, so the app cannot keep using this (so no security
issue).
4) Hand a new evdev fd to the app when it regains access.

This will allow whatever gaming crazy devices pop up to work,
and will allow force feedback to work too.

I think that trying to define a wayland controller protocol for
this is not a good idea, we will just end up wrapping evdev and
adding a whole lot of unnecessary indirection. Currently
games (e.g. libSDL) are already used to opening raw evdev nodes,
by moving the enumeration + actual opening into the compositor
we are fixing the security concerns of the old approach while
keeping all the good bits of it (mainly giving games raw
access to complex input devices), and this also solves the
seat assignment problem neatly.

Regards,

Hans


Yes I agree this is the best option and it should be used to implement
mouse support in games. They get the fd and can get all of the
unaccelerated raw relative information they need. There are also other
cases like mice which have RGB LEDs and the game will want to change
the color of these. This also allows support for multiple pointers in
games which do not all drive the system pointer.


Erm, I was specifically not talking about mice, I'm not so sure the above
is a good idea for mice actually, esp. since games which are playable by
mouse should most likely also be playable by trackpoint / touchpad on
laptops and we really do not want each game to re-implement touchpad
support. So no, the API which I propose in very rough lines above is out
of the question for mice (and touchpads and trackpoints).

Regards,

Hans


Re: Wayland Relative Pointer API Progress

2015-04-18 Thread x414e54
This is why my original suggestion was for a generic controller
protocol similar to DirectInput or pass libinput events directly
similar to HID RawInput rather than a fd.

Also, even if you are passing an fd it does not mean the client needs to
re-implement trackpad support: if the driver exports a mouse HID
usage page then you can pass this directly, or you could export an fd
to some abstracted device which transmits relative axis information
(which is what libinput is doing anyway). But my preferred idea would
be to send HID-compatible events with a device id so you can
open the device directly if you need to.

Passing around relative values in a wl_pointer and hoping the
unaccelerated data will fit into a wl_fixed_t, or even that the device
supports relative mode, seems very restrictive for games. High-dpi
mouse wheels also exist, and currently the wl_pointer axis events will
not support them. Also there is no information about the wl_pointer
axes: are they normalised, relative, absolute and/or discrete? Force
feedback/haptic mice (and trackpads) also exist.

The input protocol should be generic, allowing the implementation to
grow around it; otherwise you are limiting the growth of game
innovation on Linux to whatever a small group of people decide a
wl_pointer is.

On Sun, Apr 19, 2015 at 6:55 AM, Hans de Goede hdego...@redhat.com wrote:
 Hi,


 On 18-04-15 16:03, x414e54 wrote:

 Right.

 This has been discussed before, and as mentioned before I really
 believe that we should not define a joystick API at all,
 rather we should define an API which will allow an app to:

 1) List devices which it can get raw access to
 2) Request raw access, if this succeeds the app will be handed
 an evdev fd for the device
 3) Notify the app that the compositor is revoking access (e.g.
 because the app lost focus), when this happens the compositor
 will do a revoke ioctl on the evdev fd rendering it useless
 to the app, so the app cannot keep using this (so no security
 issue).
 4) Hand a new evdev fd to the app when it regains access.

 This will allow whatever gaming crazy devices pop up to work,
 and will allow force feedback to work too.

 I think that trying to define a wayland controller protocol for
 this is not a good idea, we will just end up wrapping evdev and
 adding a whole lot of unnecessary indirection. Currently
 games (e.g. libSDL) are already used to opening raw evdev nodes,
 by moving the enumeration + actual opening into the compositor
 we are fixing the security concerns of the old approach while
 keeping all the good bits of it (mainly giving games raw
 access to complex input devices), and this also solves the
 seat assignment problem neatly.

 Regards,

 Hans


 Yes I agree this is the best option and it should be used to implement
 mouse support in games. They get the fd and can get all of the
 unaccelerated raw relative information they need. There are also other
 cases like mice which have RGB LEDs and the game will want to change
 the color of these. This also allows support for multiple pointers in
 games which do not all drive the system pointer.


 Erm, I was specifically not talking about mice, I'm not so sure the above
 is a good idea for mice actually, esp. since games which are playable by
 mouse should most likely also be playable by trackpoint / touchpad on
 laptops and we really do not want each game to re-implement touchpad
 support. So no the API which I propose in very rough lines above is out
 of the question for mice (and touchpads and trackpoints).

 Regards,

 Hans


Re: Wayland Relative Pointer API Progress

2015-04-18 Thread x414e54
On Sun, Apr 19, 2015 at 12:45 AM, Michal Suchanek hramr...@gmail.com wrote:
 On 18 April 2015 at 16:58, x414e54 x414...@linux.com wrote:
 Hi,

 then actually mice are absolute not relative. They have two axes that
 measure absolute ball rotation speed in two directions, just like a
 joystick has two axes that measure absolute stick eccentricity.

 Thanks

 Michal

 This is not really constructive to the api but:

 Mice are not absolute because they are just measuring movement of a
 surface relative to itself, when you are not moving the mouse there is
 no axis value.

 There is: the value is 0. Which is the same as with a properly
 calibrated joystick - when you release it, it returns to the position
 where the reading of both axes is 0.

 And that is the interface expected when using a grabbed mouse with
 hidden cursor most of the time.

 There is even parallel for removing springs from a joystick in the
 mouse world - there were Russian trackballs with really big metal ball
 which would keep spinning due to its momentum until stopped.

 You could take the ball out and turn it over and put it
 back in; the absolute position of the ball has changed but the mouse
 axes have not. For some ball mice the rollers measure the movement of a

 That's because it measures speed of the ball not its position. When
 you roll the ball outside of the mouse it cannot measure it.

 wheel with small holes inside it, when it moves it breaks the
 connection the chip registers this and uses it to calculate the delta
 for that axis. Optical mice are just taking small images of the
 surface and using that information to calculate a distance moved,
 again relative motion.

 However, the reported value is absolute speed of the mouse as measured
 against the surface. There is no more relativity than with measurement
 of eccentricity of a stick relative to a central position.


 A joystick does not necessarily have two axes, and in most cases yes they
 are reporting an absolute position of the axis in the driver, but it
 does not necessarily mean that the hardware is absolute. For
 example, if a joystick used the same slotted optical system as the
 ball mice then it would be measuring relative motion and using it to
 calculate the absolute axis value; under some circumstances the
 position could become out of sync until re-calibrated by moving the
 joystick to the maximum and minimum values for all axes or having it
 automatically re-center.


 USB HID specifications define a pointer and a mouse as two completely
 different inputs. A mouse can be used as a pointer because it is
 pushing the cursor around but the pointer points at a specific
 location.

 And there is no practical way to point with a mouse to a specific
 location. Nor is there for most joysticks because they are not precise
 enough but you technically could map the stick eccentricity to screen
 coordinates. Similarly a small touchpad has no practical mapping of
 touch coordinates to screen coordinates but for a big graphics tablet
 or touchscreen surface this can be done.

 The device is never relative; only the interpretation of the obtained data is.

 Thanks

 Michal

Rather than waste time on this I will just direct you to the
universal teacher Google.

relative device is probably the best search term.


Re: Wayland Relative Pointer API Progress

2015-04-18 Thread Hans de Goede

Hi,

On 18-04-15 03:35, x414e54 wrote:

A big problem with just saying the game must use the joystick api is that
the game won't work on a machine without a joystick unless the joystick api
is emulated for the mouse. This seems to me to be exactly the same problem
and requiring exactly the same solutions, except you have moved the code
from the client to the compositor, which is usually a bad idea. Also you
have made it a pain in the ass to create simple toolkits since they now have
to provide the joystick api.


By joystick I mean a generic axis device with buttons, not
an actual joystick - controller API may be a better term. If my
gaming mouse has 40 buttons and a 7000+ dpi laser 3D mouse wheel (not
just the movement sensor), I want ALL of that information, not just
what the wayland protocol designers think is best.

Obviously a joystick is confusing and more restrictive than allowing
full access to the device, marshalled by the compositor, so the user
can run it through whichever API they want; this is why I prefer
Jonas' idea of using the actual evdev device, but the compositor
can MITM the char buffer or something similar.

This is EXACTLY how games work on other platforms when using a mouse:
they either use a raw USB/HID input API or, in the case of games
older than Windows XP, DirectInput. Obviously on Windows you
would use XInput (which will do more abstracting for you) and RawInput,
but on Mac you use HIDManager for everything, which also supports raw
mouse input. The point is that with a well-abstracted API you do not
care what you get as long as it has at least two relative axes and two
buttons.

You want the compositor to do this because the game knows nothing about
which evdev input device is associated with which seat; how does it
know what to open? Also this would be good from a security side, as you
do not have to allow full user access to an evdev device which
another user on the same system may want to use. You can have the
compositor defer to a security or sandboxing API.


Right.

This has been discussed before, and as mentioned before I really
believe that we should not define a joystick API at all,
rather we should define an API which will allow an app to:

1) List devices which it can get raw access to
2) Request raw access, if this succeeds the app will be handed
an evdev fd for the device
3) Notify the app that the compositor is revoking access (e.g.
because the app lost focus), when this happens the compositor
will do a revoke ioctl on the evdev fd rendering it useless
to the app, so the app cannot keep using this (so no security
issue).
4) Hand a new evdev fd to the app when it regains access.

This will allow whatever gaming crazy devices pop up to work,
and will allow force feedback to work too.

I think that trying to define a wayland controller protocol for
this is not a good idea, we will just end up wrapping evdev and
adding a whole lot of unnecessary indirection. Currently
games (e.g. libSDL) are already used to opening raw evdev nodes,
by moving the enumeration + actual opening into the compositor
we are fixing the security concerns of the old approach while
keeping all the good bits of it (mainly giving games raw
access to complex input devices), and this also solves the
seat assignment problem neatly.

Regards,

Hans


Re: Wayland Relative Pointer API Progress

2015-04-17 Thread Pekka Paalanen
On Fri, 17 Apr 2015 13:43:11 +0900
x414e54 x414...@linux.com wrote:

 Thank you for the comments.
 I do have a few counterpoints but I will leave after that.
 
 
  Not sure an IR/laser/wii mote pointer should even be considered a
  relative pointer since they operate in absolute coordinates. Given
  this, there is no set position hint to consider. Transmitting
  accelerometer data via a relative pointer doesn't sound reasonable.
 
 
 I think this is the issue right here. Pointers are not relative, mice
 are not pointers.

What definition of a pointer are you using?

The definition Wayland uses for a wl_pointer is a device that:
- requires a cursor image on screen to be usable
- the physical input is relative, not absolute

This definition is inspired by mice, and mice have been called pointer
devices, so we picked the well-known name pointer for mice-like
devices.

Specifically, a pointer is *not* a device where you directly point a
location on screen, like a touchscreen for example. For touchscreens,
there is a separate protocol wl_touch.

For drawing tablets, there will be yet another protocol.

Joysticks or gamepads fit into none of the above. For the rest of the
conversation, you should probably look up the long gamepad protocol
discussions from the wayland-devel mailing list archives.

A fundamental difference between a wiimote and a pointer, as far as I
understand, is that wiimote might be off-screen while a pointer never
can. You also would not unfocus a wiimote from an app window just
because it went off-screen or off-window, right? Button events should
still be delivered to the app? A pointer will unfocus, because without
grabs, the focus is expected to shift to whatever is under the pointer.

On Fri, 17 Apr 2015 14:21:58 +0900
x414e54 x414...@linux.com wrote:

 If you add in something like get a wl_input from a wl_seat which can
 be used as a generic interface to access the libinput directly in a
 safe way but still controlled the compositor if the window loses focus
 or there needs to be some translation done. This would be much more
 generic than my wl_jostick or wl_6dof proposal.

That is pretty much what Jonas meant by:

On Fri, 17 Apr 2015 11:30:16 +0800
Jonas Ådahl jad...@gmail.com wrote:

 Joysticks, gamepads, 6DOF are orthagonal to pointer locking and relative
 pointers. Currently games usually rely on opening the evdev device
 themself, and so far it doesn't seem reasonable to abstract such devices
 in the compositor. What may make more sense is to rely on the compositor
 to handle focus, passing fds around, continuing to make the client
 responsible for translating input events to character movements or
 whatever.

Passing around and revoking fds is how a raw generic interface to access
an input device would be implemented. It would be up to the client to
then use any appropriate code to handle the readily-opened evdev kernel
device.


Thanks,
pq


Re: Wayland Relative Pointer API Progress

2015-04-17 Thread Michal Suchanek
On 17 April 2015 at 09:11, Pekka Paalanen ppaala...@gmail.com wrote:
 On Fri, 17 Apr 2015 13:43:11 +0900
 x414e54 x414...@linux.com wrote:

 Thank you for the comments.
 I do have a few counterpoints but I will leave after that.

 
  Not sure an IR/laser/wii mote pointer should even be considered a
  relative pointer since they operate in absolute coordinates. Given
  this, there is no set position hint to consider. Transmitting
  accelerometer data via a relative pointer doesn't sound reasonable.
 

 I think this is the issue right here. Pointers are not relative, mice
 are not pointers.

 What definition of a pointer are you using?

 The definition Wayland uses for a wl_pointer is a device that:
 - requires a cursor image on screen to be usable
 - the physical input is relative, not absolute

 This definition is inspired by mice, and mice have been called pointer
 devices, so we picked the well-known name pointer for mice-like
 devices.

 Specifically, a pointer is *not* a device where you directly point a
 location on screen, like a touchscreen for example. For touchscreens,
 there is a separate protocol wl_touch.

 For drawing tablets, there will be yet another protocol.

 Joysticks or gamepads fit into none of the above. For the rest of the
 conversation, you should probably look up the long gamepad protocol
 discussions from the wayland-devel mailing list archives.

And how is a joystick different from a trackpoint, exactly?

It uses a different hardware interface and later a different software
interface, but for no good reason. It's just a two-axis relative input
device with buttons. Sure, the big joystick, gamepad directional cap
and trackpoint are at different places on the stick-size scale and
might have different hardware sensors, which should be reflected in
different acceleration settings, but ultimately it's the same kind of
device.


 A fundamental difference between a wiimote and a pointer, as far as I
 understand, is that wiimote might be off-screen while a pointer never
 can. You also would not unfocus a wiimote from an app window just
 because it went off-screen or off-window, right? Button events should
 still be delivered to the app? A pointer will unfocus, because without
 grabs, the focus is expected to shift to whatever is under the pointer.

And why should wiimote not unfocus unless grabbed?

I am not sure how a wiimote actually works, but from your comments it
seems it's some absolute pointing device with buttons. I should be
able to use an absolute pointing device with buttons as pointer input
if I so choose. In fact, I am using my Wacom tablet that way right now
in X11, and it happens to be an absolute pointing device with buttons.
And due to aspect mismatch my pointer can technically go off-screen.
And I will not change to a windowing system that does not allow that.
Similarly I should be able to map the Wacom tablet for exclusive use
with a particular application window or the application window
currently in focus. I do not see any reason why the wiimote should be
special and different and only allow mapping to a particular
application.

Thanks

Michal


Re: Wayland Relative Pointer API Progress

2015-04-17 Thread Michal Suchanek
On 17 April 2015 at 12:52, Hans de Goede hdego...@redhat.com wrote:
 Hi,


 On 17-04-15 11:47, Michal Suchanek wrote:

 On 17 April 2015 at 09:11, Pekka Paalanen ppaala...@gmail.com wrote:

 On Fri, 17 Apr 2015 13:43:11 +0900
 x414e54 x414...@linux.com wrote:

 Thank you for the comments.
 I do have a few counterpoints but I will leave after that.


 Not sure an IR/laser/wii mote pointer should even be considered a
 relative pointer since they operate in absolute coordinates. Given
 this, there is no set position hint to consider. Transmitting
 accelerometer data via a relative pointer doesn't sound reasonable.


 I think this is the issue right here. Pointers are not relative, mice
 are not pointers.


 What definition of a pointer are you using?

 The definition Wayland uses for a wl_pointer is a device that:
 - requires a cursor image on screen to be usable
 - the physical input is relative, not absolute

 This definition is inspired by mice, and mice have been called pointer
 devices, so we picked the well-known name pointer for mice-like
 devices.

 Specifically, a pointer is *not* a device where you directly point a
 location on screen, like a touchscreen for example. For touchscreens,
 there is a separate protocol wl_touch.

 For drawing tablets, there will be yet another protocol.

 Joysticks or gamepads fit into none of the above. For the rest of the
 conversation, you should probably look up the long gamepad protocol
 discussions from the wayland-devel mailing list archives.


 And how is a joystick different from a trackpoint, exactly?

 It uses a different hardware interface and later a different software
 interface, but for no good reason. It's just a two-axis relative input
 device with buttons. Sure, the big joystick, gamepad directional cap
 and trackpoint are at different places on the stick-size scale and
 might have different hardware sensors, which should be reflected in
 different acceleration settings, but ultimately it's the same kind of
 device.


 Actually joystick analog inputs are absolute, not relative. They give a value
 for exactly how much the stick has moved from the center.

 Except for d-pads, which are really buttons, not relative axes, so joysticks
 really are pretty much not like trackpoints in any way.


Do you mean that the absolute trackpoint eccentricity is somehow
translated to a relative motion delta in hardware, so that it looks
like a mouse although it is in fact a joystick?

Thanks

Michal


Re: Wayland Relative Pointer API Progress

2015-04-17 Thread x414e54
Yes, thank you for the comments.

I have added a bug/enhancement report related to dealing with this
from the evdev fd or libinput side. I do not have a huge amount of
time so please feel free to re-word it if you think of a better way.


If the compositor can handle this by creating its own evdev device fd
and having the client use libinput to receive the raw relative mouse
motion events and dpi information, then this seems a very acceptable
solution. I am not sure, though, if it would be a good idea to expose
the original evdev device directly, as data might need to be
transformed or understood by the compositor - in cases such as when the
home button on a gamepad maybe should act like the super key.


The wiimote example was mainly for GUI uses: in this case you get an
implicit grab serial from the button down, ask for the pointer to be
frozen/hidden, and then receive events through the evdev fd (which may
be using the accelerometer).

For using a wiimote in a windowed game it is the confinement/warping
hint that is the main issue rather than the focus lock.


Thank you for the help and good luck with the API!


Re: Wayland Relative Pointer API Progress

2015-04-17 Thread Michal Suchanek
On 17 April 2015 at 14:37, Hans de Goede hdego...@redhat.com wrote:
 Hi,


 On 17-04-15 13:17, Michal Suchanek wrote:

 On 17 April 2015 at 12:52, Hans de Goede hdego...@redhat.com wrote:

 Hi,


 On 17-04-15 11:47, Michal Suchanek wrote:


 On 17 April 2015 at 09:11, Pekka Paalanen ppaala...@gmail.com wrote:


 On Fri, 17 Apr 2015 13:43:11 +0900
 x414e54 x414...@linux.com wrote:

 Thank you for the comments.
 I do have a few counterpoints but I will leave after that.


 Not sure an IR/laser/wii mote pointer should even be considered a
 relative pointer since they operate in absolute coordinates. Given
 this, there is no set position hint to consider. Transmitting
 accelerometer data via a relative pointer doesn't sound reasonable.


 I think this is the issue right here. Pointers are not relative, mice
 are not pointers.



 What definition of a pointer are you using?

 The definition Wayland uses for a wl_pointer is a device that:
 - requires a cursor image on screen to be usable
 - the physical input is relative, not absolute

 This definition is inspired by mice, and mice have been called pointer
 devices, so we picked the well-known name pointer for mice-like
 devices.

 Specifically, a pointer is *not* a device where you directly point a
 location on screen, like a touchscreen for example. For touchscreens,
 there is a separate protocol wl_touch.

 For drawing tablets, there will be yet another protocol.

 Joysticks or gamepads fit into none of the above. For the rest of the
 conversation, you should probably look up the long gamepad protocol
 discussions from the wayland-devel mailing list archives.



 And how is a joystick different from a trackpoint, exactly?

 It uses a different hardware interface and, later, a different software
 interface, but for no good reason. It's just a 2-axis relative input
 device with buttons. Sure, the big joystick, the gamepad directional cap,
 and the trackpoint sit at different places on the stick-size scale and
 might have different hardware sensors, which should be reflected in
 different acceleration settings, but ultimately it's the same kind of
 device.



 Actually, joystick analog inputs are absolute, not relative. They give a
 value for exactly how much the stick has moved from the center.
 
 Except for d-pads, which are really buttons, not relative axes, so joysticks
 really are pretty much not like trackpoints in any way.


 Do you mean that the absolute trackpoint eccentricity is somehow
 translated to a relative motion delta in hardware, so that it looks
 like a mouse although it is in fact a joystick?


 Yes.

 Also, have you ever used a trackpoint? It is really nothing like a joystick:
 with a joystick you move the stick and then it stays in position (there
 are springs to center the stick when you let go, but you can remove those
 and everything will still work just fine).
 
 Whereas a trackpoint is more of a pressure sensor which senses how much you
 push against it in a certain direction; it does not actually move.

That's an implementation detail. The input concept is the same. And yes,
it might be hard to see the similarity between a full-size joystick
and a trackpoint. But when you throw in all those GPIO mini joysticks,
the gamepad directional joystick-like inputs, half-size joysticks,
and arcade sticks, you can see that there is a concept of stick input
that scales to different sizes with different limitations.

Thanks

Michal


Re: Wayland Relative Pointer API Progress

2015-04-17 Thread Hans de Goede

Hi,

On 17-04-15 11:47, Michal Suchanek wrote:

On 17 April 2015 at 09:11, Pekka Paalanen ppaala...@gmail.com wrote:

On Fri, 17 Apr 2015 13:43:11 +0900
x414e54 x414...@linux.com wrote:


Thank you for the comments.
I do have a few counterpoints but I will leave after that.



Not sure an IR/laser/wii mote pointer should even be considered a
relative pointer since they operate in absolute coordinates. Given
this, there is no set position hint to consider. Transmitting
acceleramoter data via a relative pointer doesn't sound reasonable.



I think this is the issue right here. Pointers are not relative, mice
are not pointers.


What definition of a pointer are you using?

The definition Wayland uses for a wl_pointer is a device that:
- requires a cursor image on screen to be usable
- the physical input is relative, not absolute

This definition is inspired by mice, and mice have been called pointer
devices, so we picked the well-known name pointer for mice-like
devices.

Specifically, a pointer is *not* a device where you directly point a
location on screen, like a touchscreen for example. For touchscreens,
there is a separate protocol wl_touch.

For drawing tablets, there will be yet another protocol.

Joysticks or gamepads fit into none of the above. For the rest of the
conversation, you should probably look up the long gamepad protocol
discussions from the wayland-devel mailing list archives.


And how is a joystick different from a trackpoint, exactly?

It uses a different hardware interface and, later, a different software
interface, but for no good reason. It's just a 2-axis relative input
device with buttons. Sure, the big joystick, the gamepad directional cap,
and the trackpoint sit at different places on the stick-size scale and
might have different hardware sensors, which should be reflected in
different acceleration settings, but ultimately it's the same kind of
device.


Actually, joystick analog inputs are absolute, not relative. They give a value
for exactly how much the stick has moved from the center.

Except for d-pads, which are really buttons, not relative axes, so joysticks
really are pretty much not like trackpoints in any way.

Regards,

Hans


Re: Wayland Relative Pointer API Progress

2015-04-17 Thread Pekka Paalanen
On Fri, 17 Apr 2015 19:14:30 +0900
x414e54 x414...@linux.com wrote:

 If the compositor can handle this by creating its own evdev device fd,
 and then having the client use libinput to receive the raw relative mouse
 motion events and DPI information, that seems a very acceptable
 solution. I am not sure, though, whether it would be a good idea to expose
 the original evdev device directly, as data might need to be
 transformed or understood by the compositor in cases such as the
 home button on a gamepad, which should perhaps act like the super key.

The point of passing the evdev fd to a client is to take the compositor
completely off the input path. For things like game controllers where
you may want high sampling rates of many different sensors, having the
compositor as a relay in between just adds latency and CPU load.

But yes, it indeed does have the downside that if an input device is
forwarded to a client like that, the compositor can no longer see the
input events. Or maybe it could get a copy of the stream, but that
would (a) bring problems with who is handling what event, and (b) not
allow all the CPU savings, as the compositor would need to check all
events to filter out the Home button events.

I think a solution to this could be splitting a single physical game
controller into multiple evdev devices already in the kernel. I don't
know how you would currently choose whether or not to get, say,
accelerometer inputs from a PS3 or PS4 controller. Putting the Home
button alone into a new evdev device would solve the problem nicely.
Does it already work that way?


Thanks,
pq


Re: Wayland Relative Pointer API Progress

2015-04-17 Thread Bill Spitzak



On 04/16/2015 07:51 PM, x414e54 wrote:

Hi,

I am wondering if there has been any recent progress on the stability of
the Wayland relative pointer API?

I had a few ideas on the original December implementation:

Whilst we definitely need the relative event support, I feel that the
client should never be allowed to warp or confine the global pointer
even if it is just a hint. There may be cases such as IR or laser
pointer devices (e.g. wii mote) which can never guarantee a warp or
pointer confinement but can still transmit accelerometer data as
relative pointer motion. Looking at most toolkits and applications they
cannot be trusted even with hints.

I think the API needs to be split into two use cases:


I agree and about 90% of what I am interested in falls in the first case:


1. GUI sliders etc. - They should be allowed to freeze the pointer and
receive relative events based on an implicit button-down grab, similar to
the drag-and-drop protocol. They should not be allowed to warp or
confine, and upon button up the grab is lost. Gnome already looks like it
uses heuristics to do this if the cursor is hidden on button down.


They MUST be able to move the cursor image. Blanking it and replacing it
with another surface will BLINK, which is a direct violation of the basic
Wayland principle that every frame is perfect. In addition, for many
input devices the compositor will need to know where the user thinks the
cursor is, so it can put the real cursor there when the grab finishes. It
also means you can't reuse any specialization the compositor did with
the cursor image (such as putting it in a special hardware plane).


I don't think this is very hard:

- the client can lock the pointer in response to an event such as a 
mouse-down.


- This makes the cursor image STOP (it does not disappear, does not 
change appearance, and does not move from where it happens to be). 
NOTHING visible happens!


- The client can change the cursor image and move it to any location 
they want (compositor may restrict this to the client input area). 
Ideally this should be in sync with buffer updates.


- The client gets events as the user moves the input pointer. One thing it
gets is IDENTICAL to the location it would have gotten if the lock had
not happened but a grab had, and also (if the device supports it) as if
the screen had unlimited area. It also appears some clients will want
unaccelerated motion as another event.


- When the mouse is released the lock is lost. This event is also 
delivered to the client, with the same xy position it would have had if 
the lock had never happened.


- On an absolute device the cursor most likely then jumps to the actual
location of the absolute pointer, producing enter/exit events as needed.
On a relative device (i.e. a mouse) the cursor should probably remain
where it is.


I have certainly moved GUI sliders that slow the cursor with an absolute 
device (a Wacom tablet) and there is no problem. The cursor jumping when 
I release the slider is perfectly natural. I believe the same thing 
applies to touch screens. So there is no reason to say this should only 
work with real mice.



2. Games - They do not really need relative pointer events; they just
want the current seat mapped to a 6DOF or joystick-style input. They
should be allowed to request some kind of wl_joystick or wl_6dof
interface. Then the compositor can decide what it actually presents to
the application for using that input. Maybe a user has a joystick they
always select to use instead of their mouse, or they have an
accelerometer device, etc. It is then up to the compositor what it does
with the actual on-screen cursor, whether it confines it or hides it,
etc.; there could be a notification of entering game mode, etc. If the
compositor is not using the same input device for wl_pointer and a
wl_joystick or wl_6dof then it does nothing. This would also allow a
user to hot-swap the device between mouse and keyboard and a gamepad
just by using the WM settings. It could also allow for using 6DOF or 3D
mice in an application which is also mapped as the default x, y pointer.

The application will then still receive absolute pointer events which it
can use for in game GUI clicks.


A big problem with just saying the game must use the joystick API is
that the game won't work on a machine without a joystick, unless the
joystick API is emulated for the mouse. This seems to me to be exactly
the same problem, requiring exactly the same solutions, except you
have moved the code from the client to the compositor, which is usually
a bad idea. Also you have made it a pain in the ass to create simple
toolkits, since they now have to provide the joystick API.


I believe this can be done exactly as above except releasing the mouse 
button does not cancel the lock. There is no need for a restriction 
rectangle, since the game can just position the cursor wherever it wants 
and thus restrict it (it can also blank it 

Re: Wayland Relative Pointer API Progress

2015-04-17 Thread x414e54
 A big problem with just saying the game must use the joystick API is that
 the game won't work on a machine without a joystick, unless the joystick API
 is emulated for the mouse. This seems to me to be exactly the same problem,
 requiring exactly the same solutions, except you have moved the code
 from the client to the compositor, which is usually a bad idea. Also you
 have made it a pain in the ass to create simple toolkits, since they now have
 to provide the joystick API.

By joystick I mean a generic axis device with buttons; we do not mean
an actual joystick, so controller API may be a better word. If my
gaming mouse has 40 buttons and a 7000+ DPI laser 3D mouse wheel (not
just the movement sensor), I want ALL of that information, not just
what the Wayland protocol designers think is best.

Obviously a joystick is confusing and more restrictive than allowing
full access to the device, marshalled by the compositor, with the user
then able to run it through whichever API they want. This is why I
prefer Jonas' idea of using the actual evdev device, but with the
compositor able to MITM the char buffer or something similar.

This is EXACTLY how games work on other platforms when using a mouse:
they either use a raw USB/HID input API or, in the case of games older
than Windows XP, DirectInput. Obviously on Windows you would use XInput
(which will do more abstracting for you) and Raw Input, but on Mac you
use HIDManager for everything, which also supports raw mouse input. The
point is that with a well-abstracted API you do not care what you get,
as long as it has at least 2 relative axes and 2 buttons.

You want the compositor to do this because the game knows nothing about
which evdev input device is associated with which seat, so how does it
know what to open? This would also be good from a security standpoint,
as you do not have to allow full user access to an evdev device which
another user on the same system may want to use. You can have the
compositor defer to a security or sandboxing API.


Re: Wayland Relative Pointer API Progress

2015-04-16 Thread x414e54
If you add in something like getting a wl_input from a wl_seat, which
can be used as a generic interface to access libinput directly in a
safe way but still controlled by the compositor, if the window loses
focus or there needs to be some translation done, this would be much
more generic than my wl_joystick or wl_6dof proposal.
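Wayland extensions are described in protocol XML, so for illustration only such an interface might be sketched roughly like this; the interface, request, and event names below are invented, not an actual proposal:

```xml
<interface name="wl_input_raw" version="1">
  <!-- Hypothetical: client asks for direct access to the seat's device -->
  <request name="get_device">
    <arg name="seat" type="object" interface="wl_seat"/>
  </request>
  <!-- Compositor hands over an fd it can later revoke, e.g. on focus loss -->
  <event name="device_fd">
    <arg name="fd" type="fd"/>
  </event>
  <event name="revoked"/>
</interface>
```

The revoked event is the part that keeps the compositor in control: the client reads the fd directly while focused, and the compositor can cut the stream at any time.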

On Fri, Apr 17, 2015 at 1:48 PM, x414e54 x414...@linux.com wrote:
 Actually, sorry, I was wrong. UE4 uses HID raw input, but there are some
 older game engines that used to use DirectInput.

 On Fri, Apr 17, 2015 at 1:43 PM, x414e54 x414...@linux.com wrote:
 Thank you for the comments.
 I do have a few counterpoints but I will leave after that.


 Not sure an IR/laser/wii mote pointer should even be considered a
 relative pointer since they operate in absolute coordinates. Given
 this, there is no set position hint to consider. Transmitting
 accelerometer data via a relative pointer doesn't sound reasonable.


 I think this is the issue right here. Pointers are not relative, mice
 are not pointers.

 I think this is perfectly in line with how it would work in practice:
 when you button down on a GUI widget and ask for a relative pointer,
 the compositor just hides the pointer, switches to the wii mote
 accelerometer, and transforms the motion into a 2D projection. If it
 does not have an accelerometer then the relative motion finishes at
 the edge of the input area.

 Sliders etc will be possible with the pointer lock and relative pointer
 protocols. Confinement has other use cases.

 Yes, but this protocol should be revised to take an implicit grab
 serial and freeze the pointer, and be separate from the
 explicit/long-term grab required for the game API.

 It needs to be separate to prevent GUI applications receiving a
 generic explicit grab on the pointer when they do not need it.

 Well, they do. Sure, consoles don't tend to use a mouse for input, but
 on PC, a very large amount of games do. And a large amount of those tend
 to use it in relative mode i.e. not to show a cursor (Quake,
 Neverball, ...).

 I am not sure if you have made a game, but I have shipped a few, and I
 am currently sitting in front of the source for a major game engine:
 the relative pointer is really just a 2-axis joystick delta. It is
 labeled separately from the gamepad thumb axis, but it still reaches
 exactly the same API call at the end, which then uses the 2-axis
 deltas to rotate or transform a 3D view. Even with absolute mouse
 positions it is still transformed into an axis delta.

 People seem to think that because mouse input is superior to a
 thumb-stick for look aim in an FPS, or because one is mapped to a
 cursor, that they are somehow completely different programmatically...
 they are not.


 Joysticks, gamepads, 6DOF are orthogonal to pointer locking and relative
 pointers. Currently games usually rely on opening the evdev device
 themselves, and so far it doesn't seem reasonable to abstract such devices
 in the compositor. What may make more sense is to rely on the compositor
 to handle focus, passing fds around, continuing to make the client
 responsible for translating input events to character movements or
 whatever.

 On Windows you would use DirectInput to open the mouse for relative
 motion events; this is exactly the same API as used for joysticks. In
 DirectInput a mouse is just a 2-axis device with the mouse label.
 From the point of view of games they really are not orthogonal: UE4 on
 Windows uses DirectInput (a joystick-style API) for its high-resolution
 mouse mode.

 There are various reasons why you would also need to abstract the
 joystick API, one being multiple games running at the same time trying
 to access the joystick: you need to give the focused window access, for
 example. Also, in a 3D VR compositor you would need to translate plenty
 of matrices from the HMD tracking and the 6DOF input devices to get the
 correct view of a 3D window. It would not be reasonable to allow an
 application to access these devices directly.

 It doesn't make any sense for the compositor to drive hot-swapping of
 what input device type a game should use. Many games probably wouldn't
 even support changing from a mouse+keyboard to a gamepad simply because
 the gameplay would be different. A game needs to decide itself what
 input device type it should use.

 As above, games just check the axis label against a list of mappings
 to work out whether it is a rotate or a translate, etc. You could
 perfectly well have a gamepad thumb-stick as movement and a mouse as
 look, and have the compositor swap the two inputs without the game
 caring. Internally the game would still think this is the mouse axis;
 it is irrelevant if it is coming from a different device than the
 wl_pointer, it just needs to be from the same wl_seat.

 Emulating pointer devices from controller input I think is completely
 out of scope for any protocol. It can be done by something server side
 if needed.

 This is exactly what the compositor is currently doing with 

Re: Wayland Relative Pointer API Progress

2015-04-16 Thread x414e54
Actually, sorry, I was wrong. UE4 uses HID raw input, but there are some
older game engines that used to use DirectInput.

On Fri, Apr 17, 2015 at 1:43 PM, x414e54 x414...@linux.com wrote:
 Thank you for the comments.
 I do have a few counterpoints but I will leave after that.


 Not sure an IR/laser/wii mote pointer should even be considered a
 relative pointer since they operate in absolute coordinates. Given
 this, there is no set position hint to consider. Transmitting
 accelerometer data via a relative pointer doesn't sound reasonable.


 I think this is the issue right here. Pointers are not relative, mice
 are not pointers.

 I think this is perfectly in line with how it would work in practice:
 when you button down on a GUI widget and ask for a relative pointer,
 the compositor just hides the pointer, switches to the wii mote
 accelerometer, and transforms the motion into a 2D projection. If it
 does not have an accelerometer then the relative motion finishes at
 the edge of the input area.

 Sliders etc will be possible with the pointer lock and relative pointer
 protocols. Confinement has other use cases.

 Yes, but this protocol should be revised to take an implicit grab
 serial and freeze the pointer, and be separate from the
 explicit/long-term grab required for the game API.

 It needs to be separate to prevent GUI applications receiving a
 generic explicit grab on the pointer when they do not need it.

 Well, they do. Sure, consoles don't tend to use a mouse for input, but
 on PC, a very large amount of games do. And a large amount of those tend
 to use it in relative mode i.e. not to show a cursor (Quake,
 Neverball, ...).

 I am not sure if you have made a game, but I have shipped a few, and I
 am currently sitting in front of the source for a major game engine:
 the relative pointer is really just a 2-axis joystick delta. It is
 labeled separately from the gamepad thumb axis, but it still reaches
 exactly the same API call at the end, which then uses the 2-axis
 deltas to rotate or transform a 3D view. Even with absolute mouse
 positions it is still transformed into an axis delta.

 People seem to think that because mouse input is superior to a
 thumb-stick for look aim in an FPS, or because one is mapped to a
 cursor, that they are somehow completely different programmatically...
 they are not.


 Joysticks, gamepads, 6DOF are orthogonal to pointer locking and relative
 pointers. Currently games usually rely on opening the evdev device
 themselves, and so far it doesn't seem reasonable to abstract such devices
 in the compositor. What may make more sense is to rely on the compositor
 to handle focus, passing fds around, continuing to make the client
 responsible for translating input events to character movements or
 whatever.

 On Windows you would use DirectInput to open the mouse for relative
 motion events; this is exactly the same API as used for joysticks. In
 DirectInput a mouse is just a 2-axis device with the mouse label.
 From the point of view of games they really are not orthogonal: UE4 on
 Windows uses DirectInput (a joystick-style API) for its high-resolution
 mouse mode.

 There are various reasons why you would also need to abstract the
 joystick API, one being multiple games running at the same time trying
 to access the joystick: you need to give the focused window access, for
 example. Also, in a 3D VR compositor you would need to translate plenty
 of matrices from the HMD tracking and the 6DOF input devices to get the
 correct view of a 3D window. It would not be reasonable to allow an
 application to access these devices directly.

 It doesn't make any sense for the compositor to drive hot-swapping of
 what input device type a game should use. Many games probably wouldn't
 even support changing from a mouse+keyboard to a gamepad simply because
 the gameplay would be different. A game needs to decide itself what
 input device type it should use.

 As above, games just check the axis label against a list of mappings
 to work out whether it is a rotate or a translate, etc. You could
 perfectly well have a gamepad thumb-stick as movement and a mouse as
 look, and have the compositor swap the two inputs without the game
 caring. Internally the game would still think this is the mouse axis;
 it is irrelevant if it is coming from a different device than the
 wl_pointer, it just needs to be from the same wl_seat.

 Emulating pointer devices from controller input I think is completely
 out of scope for any protocol. It can be done by something server side
 if needed.

 This is exactly what the compositor is currently doing with mouse input.


Re: Wayland Relative Pointer API Progress

2015-04-16 Thread x414e54
Thank you for the comments.
I do have a few counterpoints but I will leave after that.


 Not sure an IR/laser/wii mote pointer should even be considered a
 relative pointer since they operate in absolute coordinates. Given
 this, there is no set position hint to consider. Transmitting
 accelerometer data via a relative pointer doesn't sound reasonable.


I think this is the issue right here. Pointers are not relative, mice
are not pointers.

I think this is perfectly in line with how it would work in practice:
when you button down on a GUI widget and ask for a relative pointer,
the compositor just hides the pointer, switches to the wii mote
accelerometer, and transforms the motion into a 2D projection. If it
does not have an accelerometer then the relative motion finishes at
the edge of the input area.

 Sliders etc will be possible with the pointer lock and relative pointer
 protocols. Confinement has other use cases.

Yes, but this protocol should be revised to take an implicit grab
serial and freeze the pointer, and be separate from the
explicit/long-term grab required for the game API.

It needs to be separate to prevent GUI applications receiving a
generic explicit grab on the pointer when they do not need it.

 Well, they do. Sure, consoles don't tend to use a mouse for input, but
 on PC, a very large amount of games do. And a large amount of those tend
 to use it in relative mode i.e. not to show a cursor (Quake,
 Neverball, ...).

I am not sure if you have made a game, but I have shipped a few, and I
am currently sitting in front of the source for a major game engine:
the relative pointer is really just a 2-axis joystick delta. It is
labeled separately from the gamepad thumb axis, but it still reaches
exactly the same API call at the end, which then uses the 2-axis
deltas to rotate or transform a 3D view. Even with absolute mouse
positions it is still transformed into an axis delta.

People seem to think that because mouse input is superior to a
thumb-stick for look aim in an FPS, or because one is mapped to a
cursor, that they are somehow completely different programmatically...
they are not.


 Joysticks, gamepads, 6DOF are orthogonal to pointer locking and relative
 pointers. Currently games usually rely on opening the evdev device
 themselves, and so far it doesn't seem reasonable to abstract such devices
 in the compositor. What may make more sense is to rely on the compositor
 to handle focus, passing fds around, continuing to make the client
 responsible for translating input events to character movements or
 whatever.

On Windows you would use DirectInput to open the mouse for relative
motion events; this is exactly the same API as used for joysticks. In
DirectInput a mouse is just a 2-axis device with the mouse label.
From the point of view of games they really are not orthogonal: UE4 on
Windows uses DirectInput (a joystick-style API) for its high-resolution
mouse mode.

There are various reasons why you would also need to abstract the
joystick API, one being multiple games running at the same time trying
to access the joystick: you need to give the focused window access, for
example. Also, in a 3D VR compositor you would need to translate plenty
of matrices from the HMD tracking and the 6DOF input devices to get the
correct view of a 3D window. It would not be reasonable to allow an
application to access these devices directly.

 It doesn't make any sense for the compositor to drive hot-swapping of
 what input device type a game should use. Many games probably wouldn't
 even support changing from a mouse+keyboard to a gamepad simply because
 the gameplay would be different. A game needs to decide itself what
 input device type it should use.

As above, games just check the axis label against a list of mappings
to work out whether it is a rotate or a translate, etc. You could
perfectly well have a gamepad thumb-stick as movement and a mouse as
look, and have the compositor swap the two inputs without the game
caring. Internally the game would still think this is the mouse axis;
it is irrelevant if it is coming from a different device than the
wl_pointer, it just needs to be from the same wl_seat.

 Emulating pointer devices from controller input I think is completely
 out of scope for any protocol. It can be done by something server side
 if needed.

This is exactly what the compositor is currently doing with mouse input.


Re: Wayland Relative Pointer API Progress

2015-04-16 Thread Jonas Ådahl
On Fri, Apr 17, 2015 at 11:51:41AM +0900, x414e54 wrote:
 Hi,
 
 I am wondering if there has been any recent progress on the stability of
 the Wayland relative pointer API?

AFAIK it is being, or is still to be, reviewed a bit more. I have some
kind of plan to submit a version with the previous issues addressed, but
haven't gotten to it yet.

 
 I had a few ideas on the original December implementation:
 
 Whilst we definitely need the relative event support, I feel that the
 client should never be allowed to warp or confine the global pointer even
 if it is just a hint. There may be cases such as IR or laser pointer
 devices (e.g. wii mote) which can never guarantee a warp or pointer
 confinement but can still transmit accelerometer data as relative pointer
 motion. Looking at most toolkits and applications they cannot be trusted
 even with hints.

Not sure an IR/laser/wii mote pointer should even be considered a
relative pointer since they operate in absolute coordinates. Given
this, there is no set position hint to consider. Transmitting
accelerometer data via a relative pointer doesn't sound reasonable.

 
 I think the API needs to be split into two use cases:
 
 1. GUI sliders etc. - They should be allowed to freeze the pointer and
 receive relative events based on an implicit button-down grab, similar to
 the drag-and-drop protocol. They should not be allowed to warp or confine,
 and upon button up the grab is lost. Gnome already looks like it uses
 heuristics to do this if the cursor is hidden on button down.

Sliders etc will be possible with the pointer lock and relative pointer
protocols. Confinement has other use cases.

 
 2. Games - They do not really need relative pointer events; they just want
 the current seat mapped to a 6DOF or joystick-style input. They should be
 allowed to request some kind of wl_joystick or wl_6dof interface. Then the
 compositor can decide what it actually presents to the application for
 using that input. Maybe a user has a joystick they always select to use
 instead of their mouse, or they have an accelerometer device, etc. It is
 then up to the compositor what it does with the actual on-screen cursor,
 whether it confines it or hides it, etc.; there could be a notification of
 entering game mode, etc. If the compositor is not using the same input
 device for wl_pointer and a wl_joystick or wl_6dof then it does nothing.
 This would also allow a user to hot-swap the device between mouse and
 keyboard and a gamepad just by using the WM settings. It could also allow
 for using 6DOF or 3D mice in an application which is also mapped as the
 default x, y pointer.
 
 The application will then still receive absolute pointer events which it
 can use for in game GUI clicks.
 

Well, they do. Sure, consoles don't tend to use a mouse for input, but
on PC a very large number of games do, and a large number of those tend
to use it in relative mode, i.e. without showing a cursor (Quake,
Neverball, ...).

Joysticks, gamepads and 6DOF devices are orthogonal to pointer locking
and relative pointers. Currently games usually rely on opening the evdev
device themselves, and so far it doesn't seem reasonable to abstract
such devices in the compositor. What may make more sense is to rely on
the compositor to handle focus and pass fds around, while continuing to
make the client responsible for translating input events into character
movements or whatever.
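As a rough sketch of what reading an evdev device directly involves: a
client handed an fd by the compositor would read fixed-size struct
input_event records and decode them itself. The layout below assumes a
64-bit Linux (a struct timeval of two longs, then type/code/value); the
fake buffer at the end merely stands in for a real read() from the fd:

```python
import struct

# struct input_event on 64-bit Linux: struct timeval (two longs),
# then __u16 type, __u16 code, __s32 value.
EVENT_FORMAT = "llHHi"
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)   # 24 bytes on 64-bit

EV_REL, REL_X, REL_Y = 0x02, 0x00, 0x01      # from linux/input-event-codes.h

def decode_events(buf):
    """Split a raw read() from an evdev fd into (type, code, value) tuples."""
    events = []
    for off in range(0, len(buf) - EVENT_SIZE + 1, EVENT_SIZE):
        _sec, _usec, etype, code, value = struct.unpack_from(EVENT_FORMAT, buf, off)
        events.append((etype, code, value))
    return events

# In a real client the fd would come from the compositor; here we fake
# a buffer containing one REL_X and one REL_Y event.
fake = struct.pack(EVENT_FORMAT, 0, 0, EV_REL, REL_X, 5) + \
       struct.pack(EVENT_FORMAT, 0, 0, EV_REL, REL_Y, -3)
print(decode_events(fake))   # [(2, 0, 5), (2, 1, -3)]
```

The translation from these (type, code, value) tuples into character
movement is then entirely the game's business, which is the division of
labour argued for above.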

It doesn't make any sense for the compositor to hot-swap which input
device type a game should use. Many games probably wouldn't even support
changing from mouse+keyboard to a gamepad, simply because the gameplay
would be different. A game needs to decide for itself what input device
type it should use.

Emulating pointer devices from controller input is, I think, completely
out of scope for any protocol. It can be done by something server side
if needed.


Jonas

 Any opinions on this would be appreciated.


Wayland Relative Pointer API Progress

2015-04-16 Thread x414e54
Hi,

I am wondering if there has been any recent progress on the stability of
the Wayland relative pointer API?

I had a few ideas on the original December implementation:

Whilst we definitely need the relative event support, I feel that the
client should never be allowed to warp or confine the global pointer even
if it is just a hint. There may be cases such as IR or laser pointer
devices (e.g. wii mote) which can never guarantee a warp or pointer
confinement but can still transmit accelerometer data as relative pointer
motion. Looking at most toolkits and applications, they cannot be trusted
even with hints.

I think the API needs to be split into two use cases:

1. GUI sliders etc. - They should be allowed to freeze the pointer and
receive relative events based on an implicit button down grab similar to
the drag and drop protocol. They should not be allowed to warp or confine
and upon button up the grab is lost. Gnome already looks like it uses
heuristics to do this if the cursor is hidden on button down.

2. Games - They do not really need relative pointer events; they just want
the current seat mapped to a 6DOF or joystick style input. They should be
allowed to request some kind of wl_joystick or wl_6dof interface. Then the
compositor can decide what it actually presents to the application for
using that input. Maybe a user has a joystick they always select to use
instead of their mouse or they have an accelerometer device etc. It is then
up to the compositor what it does with the actual on screen cursor if it
confines it or hides it etc, there could be a notification of entering
game mode etc. If the compositor is not using the same input device for
wl_pointer and a wl_joystick or wl_6dof then it does nothing. This would
also allow a user to hot-swap the device between mouse and keyboard and a
gamepad just by using the WM settings. It could also allow for using 6DOF
or 3D mice in an application which is also mapped as the default x, y
pointer.

The application will then still receive absolute pointer events which it
can use for in game GUI clicks.

Any opinions on this would be appreciated.