Input and games.

2013-04-18 Thread Todd Showalter
I'm a game developer, and we're hoping to have our games working
properly with Wayland.  Input is a particular point of interest for
me. The traditional desktop input model is what tends to drive input
interfaces, but games have somewhat unique requirements that at times
mesh badly with the standard desktop model.

Is there a roadmap for input support I can look over?  Is there
anything I can do to help make Wayland game-friendly?

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-04-18 Thread Jonas Kulla
2013/4/18 Todd Showalter 

> I'm a game developer, and we're hoping to have our games working
> properly with Wayland.  Input is a particular point of interest for
> me. The traditional desktop input model is what tends to drive input
> interfaces, but games have somewhat unique requirements that at times
> mesh badly with the standard desktop model.
>
> Is there a roadmap for input support I can look over?  Is there
> anything I can do to help make Wayland game-friendly?
>
> Todd.
>
> --
>  Todd Showalter, President,
>  Electron Jump Games, Inc.
>

Hi Todd!

What exactly do you mean by "unique requirements"? Can you be a little
bit more specific? In general I think the current consensus (correct me if
I'm wrong) is that using the default wayland pointer and keyboard events
plus Joypad support via SDL is sufficient for most purposes.

Personally, I'd be interested in seeing joypads become first class input
devices on wayland (as a capability of wl_seat alongside mice/keyboard
etc.),
seeing that evdev drivers already exist for most gamepads.
But I'm unfortunately lacking experience and knowledge in that field,
otherwise I'd give it a hacking attempt myself.

So yeah, for now I think SDL should serve you perfectly well =)

Jonas


Re: Input and games.

2013-04-18 Thread Todd Showalter
On Thu, Apr 18, 2013 at 5:29 PM, Jonas Kulla  wrote:

> What exactly do you mean by "unique requirements"? Can you be a little
> bit more specific? In general I think the current consensus (correct me if
> I'm wrong) is that using the default wayland pointer and keyboard events
> plus Joypad support via SDL is sufficient for most purposes.

In general we can work with anything as long as we can get the
right events and process them; it's perhaps more a matter of
convenience.

There are a few things of note that are somewhat specific to games:

Permissions

We're often running things like mods, user-generated scripts, and
in general lots of untrusted content, so the fewer privileges we need
to handle things like input, the better.

Hooking Things Up

This may be beyond the scope of Wayland, but at least in the past
I've found that in particular joysticks/gamepads are a bit of a
guessing game for the developer.  You can usually assume that the
first stick is the first couple of axis values in the axis array, but
after that, it's a tossup whether an analog axis is part of a stick, a
trigger, or a pressure-sensitive button.

It would be really nice if there was some sort of configuration
that could be read so we'd know how the player wanted these things
mapped, and some sort of way for the player to set that configuration
up outside the game.

Event Driven vs. Polling

Modern gui applications tend to be event-driven, which makes
sense; most modern desktop applications spend most of their time doing
nothing and waiting for the user to generate input.  Games are
different, in that they tend to be simulation-based, and things are
happening regardless of whether the player is providing input.

In most games, you have to poll input between simulation ticks.
If you accept and process an input event in the middle of a simulation
tick, your simulation will likely be internally inconsistent.  Input
in games typically moves or changes in-game objects, and if input
affects an object mid-update, part of the simulation tick will have
been calculated based on the old state of the object, and the rest
will be based on the new state.

To deal with this on event-driven systems, games must either
directly poll the input system, or else accumulate events and process
them between simulation ticks.  Either works, but being able to poll
means the game needs to do less work.
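
For illustration, the accumulate-and-process approach amounts to
something like this (all names here are invented, not any real API):

typedef struct
{
    int key;        /* which key/button */
    int pressed;    /* 1 = down, 0 = up */
} INPUT_EVENT;

#define MAX_EVENTS 256

static INPUT_EVENT pending[MAX_EVENTS];
static int pending_count = 0;

/* Called from the event loop whenever input arrives, possibly mid-tick. */
void input_event_received(const INPUT_EVENT *event)
{
    if(pending_count < MAX_EVENTS)
        pending[pending_count++] = *event;
}

/* Called once between simulation ticks, so the whole tick sees one
   consistent input state. */
void input_apply_pending(int *key_state)
{
    int i;

    for(i = 0; i < pending_count; i++)
        key_state[pending[i].key] = pending[i].pressed;

    pending_count = 0;
}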

Input Sources & Use

Sometimes games want desktop-style input (clicking buttons,
entering a name with the keyboard), but often games want to treat all
the available input data as either digital values (mouse buttons,
keyboard keys, gamepad buttons...), constrained-axis "analog" (gamepad
triggers, joysticks) or unconstrained-axis "analog" (mouse/trackball).
Touch input is a bit of a special case, since it's nearly without
context.

Games usually care about all of:

- the state of buttons/keys -- whether they are currently down or up
-- think WASD here
- edge detection of buttons/keys -- trigger, release and state change
- the value of each input axis -- joystick deflection, screen position
of the cursor, etc
- the delta of each input axis

From what I've seen, SDL does not give us the button/key state
without building a layer on top of it; we only get edge detection.
Likewise, as far as I understand, nothing gives us deltas.
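
For the axis side, the layer we end up building looks roughly like
this (names invented):

typedef struct
{
    float prev;     /* last tick's position */
    float cur;      /* this tick's position */
    float delta;    /* cur - prev */
} AXIS_STATE;

void axis_event(AXIS_STATE *axis, float value)   /* per incoming event */
{
    axis->cur = value;
}

void axis_tick(AXIS_STATE *axis)                 /* once per sim tick */
{
    axis->delta = axis->cur - axis->prev;        /* the delta nothing provides */
    axis->prev = axis->cur;
}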

Input Capture

It would be very helpful to have an input capture mechanism that
could be turned on and off easily; I'd like to be able to have mouse
input captured when a game is playing, but be able to shut off the
mouse capture if the player brings up the pause menu.  I'd also like
it to deactivate if the game crashes, because at least in development
that can happen a lot.

> Personally, I'd be interested in seeing joypads become first class input
> devices on wayland (as a capability of wl_seat alongside mice/keyboard
> etc.),

Hear hear!

> seeing that evdev drivers already exist for most gamepads.
> But I'm unfortunately lacking experience and knowledge in that field,
> otherwise I'd give it a hacking attempt myself.
>
> So yeah, for now I think SDL should serve you perfectly well =)

SDL works, but it's not ideal; it carries over a lot of the
impedance mismatch with games that desktop environments have on
their own.

  Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-04-19 Thread Pekka Paalanen
Hi Todd,

I am going to reply from the Wayland protocol point of view, and what
Wayland explicitly can (and must) do for you. This is likely much lower
level than what a game programmer would like to use. How SDL or some
other higher level library exposes input is a different matter, and I
will not comment on that. We just want to make everything possible on
the Wayland protocol level.


On Thu, 18 Apr 2013 18:22:11 -0400
Todd Showalter  wrote:

> On Thu, Apr 18, 2013 at 5:29 PM, Jonas Kulla 
> wrote:
> 
> > What exactly do you mean by "unique requirements"? Can you be a
> > little bit more specific? In general I think the current consensus
> > (correct me if I'm wrong) is that using the default wayland pointer
> > and keyboard events plus Joypad support via SDL is sufficient for
> > most purposes.
> 
> In general we can work with anything as long as we can get the
> right events and process them; it's perhaps more a matter of
> convenience.
> 
> There are a few things of note that are somewhat specific to
> games:
> 
> Permissions
> 
> We're often running things like mods, user-generated scripts, and
> in general lots of untrusted content, so the fewer privileges we need
> to handle things like input, the better.

I do not think we can happily let client applications open input devices
themselves, so this is clearly a thing we need to improve on. In other
words, I believe we should come up with a protocol extension where the
server opens the input devices, and either passes the file descriptor to
a client, or the server translates evdev events into Wayland protocol
events. "How" and "what" are still open questions, as is every other
detail of input devices that are not keyboards, mice, or touchscreens.

There was once some talk about "raw input event protocol", but there is
not even a sketch of it, AFAIK.

> Hooking Things Up
> 
> This may be beyond the scope of Wayland, but at least in the past
> I've found that in particular joysticks/gamepads are a bit of a
> guessing game for the developer.  You can usually assume that the
> first stick is the first couple of axis values in the axis array, but
> after that, it's a tossup whether an analog axis is part of a stick, a
> trigger, or a pressure-sensitive button.
> 
> It would be really nice if there was some sort of configuration
> that could be read so we'd know how the player wanted these things
> mapped, and some sort of way for the player to set that configuration
> up outside the game.

Right, and whether this could be a Wayland thing or not, depends on the
above, how to handle misc input devices in general.

Keyboards already have extensive mapping capabilities. A Wayland server
sends keycodes (I forget in which space exactly) and a keymap, and
clients feed the keymap and keycodes into libxkbcommon, which
translates them into something actually useful. Maybe something similar
could be invented for game controllers? But yes, this is off-topic for
Wayland, apart from the protocol of what event codes and other data to
pass.

> Event Driven vs. Polling
> 
> Modern gui applications tend to be event-driven, which makes
> sense; most modern desktop applications spend most of their time doing
> nothing and waiting for the user to generate input.  Games are
> different, in that they tend to be simulation-based, and things are
> happening regardless of whether the player is providing input.
> 
> In most games, you have to poll input between simulation ticks.
> If you accept and process an input event in the middle of a simulation
> tick, your simulation will likely be internally inconsistent.  Input
> in games typically moves or changes in-game objects, and if input
> affects an object mid-update, part of the simulation tick will have
> been calculated based on the old state of the object, and the rest
> will be based on the new state.
> 
> To deal with this on event-driven systems, games must either
> directly poll the input system, or else accumulate events and process
> them between simulation ticks.  Either works, but being able to poll
> means the game needs to do less work.

The Wayland protocol is event driven. Polling does not make sense, since it
would mean a synchronous round-trip to the server, which for something
like this is just far too expensive, and easily (IMHO) worked around.

So, you have to maintain input state yourself, or have a library you use do it.
It could even be off-loaded to another thread.

There is also a huge advantage over polling: in an event driven design,
it is impossible to miss very fast, transient actions, which polling
would never notice. And whether you need to know if such a transient
happened, or how many times it happened, or how long each
transient took between two game ticks, is all up to you and available.

I once heard about some hardcore gamer complaining that in some
systems or under some conditions, probably related to the
ridiculous framerates gamers usually demand, the button sequenc

Re: Input and games.

2013-04-19 Thread Todd Showalter
On Fri, Apr 19, 2013 at 5:18 AM, Pekka Paalanen  wrote:

> I am going to reply from the Wayland protocol point of view, and what
> Wayland explicitly can (and must) do for you. This is likely much lower
> level than what a game programmer would like to use. How SDL or some
> other higher level library exposes input is a different matter, and I
> will not comment on that. We just want to make everything possible on
> the Wayland protocol level.

That's fair.  We don't use SDL in our projects, so I'm coming at
this partly from the point of view of someone who will be operating at
the protocol level.

> I do not think we can happily let client applications open input devices
> themselves, so this is clearly a thing we need to improve on. In other
> words, I believe we should come up with a protocol extension where the
> server opens the input devices, and either passes the file descriptor to
> a client, or the server translates evdev events into Wayland protocol
> events. "How" and "what" are still open questions, as is every other
> detail of input devices that are not keyboards, mice, or touchscreens.

This is certainly what I'd prefer, personally, whether it's a
file-descriptor based system, event messaging, or polling functions.
It would be really nice to get gamepads and the like in there, if
possible.

> There was once some talk about "raw input event protocol", but there is
> not even a sketch of it, AFAIK.

I'm not familiar enough with Wayland yet to take the lead on
something like that, but I can certainly help.

>> It would be really nice if there was some sort of configuration
>> that could be read so we'd know how the player wanted these things
>> mapped, and some sort of way for the player to set that configuration
>> up outside the game.
>
> Right, and whether this could be a Wayland thing or not, depends on the
> above, how to handle misc input devices in general.
>
> Keyboards already have extensive mapping capabilities. A Wayland server
> sends keycodes (I forget in which space exactly) and a keymap, and
> clients feed the keymap and keycodes into libxkbcommon, which
> translates them into something actually useful. Maybe something similar
> could be invented for game controllers? But yes, this is off-topic for
> Wayland, apart from the protocol of what event codes and other data to
> pass.

Fair enough.

> The Wayland protocol is event driven. Polling does not make sense, since it
> would mean a synchronous round-trip to the server, which for something
> like this is just far too expensive, and easily (IMHO) worked around.
>
> So, you have to maintain input state yourself, or have a library you use do it.
> It could even be off-loaded to another thread.

This is what we do now, essentially; accumulate the incoming
events to assemble each frame's input device state.  It would be
convenient if Wayland did it for us, but obviously we're already
operating this way on X11, Win32 and OSX.

> There is also a huge advantage over polling: in an event driven design,
> it is impossible to miss very fast, transient actions, which polling
> would never notice. And whether you need to know if such a transient
> happened, or how many times it happened, or how long each
> transient took between two game ticks, is all up to you and available.

In truth, we don't usually deal with pure polling at the low level
unless it's a game where we can guarantee that we're not going to drop
frames.  Even then, things like mouse, touch or stylus input can come
in way faster than vsync, and game simulation ticks are usually (for
relatively obvious reasons) timed to vsyncs.

In our engine, the input system has several parts, collected in a
per-player virtualized input structure.  It contains:

- analog axis
  - previous position
  - current position
  - delta (current - prev)
  - array of positions used to generate this frame's data

- buttons
  - previous frame state bitmap (1 bit per key/button)
  - current frame state bitmap
  - trigger bitmap (cur & ~prev)
  - release bitmap (prev & ~cur)
  - byte map of presses

If a key/button event was received since the last update, that key
or button is left down for at least one update, even if it went up
again before the snapshot went out.  If the game cares how many times
a button or key was pressed between updates, it can look the key up in
the byte map rather than the bitmap.
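
In rough C, the latching and counting described above comes out to
something like this (sizes and names illustrative):

#include <stdint.h>

#define MAX_BUTTONS 64

typedef struct
{
    uint64_t prev;                   /* previous frame state bitmap */
    uint64_t cur;                    /* current frame state bitmap */
    uint64_t trigger;                /* cur & ~prev */
    uint64_t release;                /* prev & ~cur */
    uint64_t latched;                /* went down since the last snapshot */
    uint8_t  presses[MAX_BUTTONS];   /* byte map of press counts */
} BUTTON_STATE;

void button_event(BUTTON_STATE *b, int index, int down)
{
    if(down)
    {
        b->latched |= 1ull << index;
        b->presses[index]++;   /* counted even if released before snapshot */
    }
}

void button_snapshot(BUTTON_STATE *b, uint64_t raw_state)
{
    b->prev = b->cur;
    b->cur = raw_state | b->latched;   /* a fast tap stays down for one tick */
    b->trigger = b->cur & ~b->prev;
    b->release = b->prev & ~b->cur;
    b->latched = 0;
    /* presses[] is read by the game this tick, then cleared when the
       next accumulation window opens (not shown). */
}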

Likewise, while accumulated position/delta is usually good enough
for mouse/touch/stylus input and almost always good enough for
joystick input, there are times when you want to do things like
gesture recognition where it really pays to have the data at the
finest possible resolution.  Most parts of the game won't care, but
the data is there if it's needed.

> I once heard about some hardcore gamer complaining that in some
> systems or under some conditions, probably related to the
> ridiculous framerates gamers usually demand, the button sequence he hits
> in a fractio

Re: Input and games.

2013-04-19 Thread Jonas Kulla
2013/4/19 Todd Showalter 

> On Fri, Apr 19, 2013 at 5:18 AM, Pekka Paalanen 
> wrote:
>

> > Event driven is a little more work for the "simple" games, but it gives
> > you guarantees. Would you not agree?
>
> We can definitely work with it.  As much as anything it's a
> question of convenience; the question is really how much
> superstructure we need to build on top to get what we need.  We've
> already got that superstructure elsewhere, so porting it over is
> simple enough.  It would be more convenient if we didn't have to, but
> it's not a deal breaker.
>
> For context, I'm not trying to convince you to change the protocol
> or the model per se; aside from anything else, I don't yet understand
> it well enough to seriously critique it.  A large part of what I'm
> hoping to do here is offer some insight into how games tend to use
> input, the kind of needs games often have, and the sorts of
> considerations that make a system easier or harder to put a game on.
> Wayland obviously has competing considerations, some of which are
> arguably more important than games.  If one can imagine such a thing.
>
> One thing worth noting here is why we want to operate on virtualized
> input structures rather than raw events.  One reason I mentioned
> above; accumulating events so that they can be applied between frames.
>  Another reason is complexity management; games can be quite complex
> beasts consisting of many parts, and everything that can be done to
> isolate those parts makes the game easier to develop and maintain.
>
> The classic problem with a purely event-driven program is that
> somewhere in it there is a giant event loop that knows about
> everything in the program.  In something simple like a calculator,
> it's not a problem, but once you scale up to a large system with
> multiple subsystems the event loop can turn into a nightmare.  Having
> virtualized input structures that the game can query means that input
> tests can be isolated to the code where they belong. ie:
>
> if(KeyTrigger(KEY_D) && KeyDown(KEY_CTRL))
> {
>   Log("heap integrity %d\n", check_heap_integrity());
> }
>
> You can achieve some of the same modularity with function pointer
> lists or similar hooks, but having a virtualized input structure has
> (in my experience at least) been the cleanest abstraction.


I can totally see where you're coming from (having worked on a small
engine myself in the past), but I feel like creating static input states
should always be done on client side. Especially in Wayland where
"frame-perfectness" is crucial, round-trips such as input state polling
are strongly discouraged.
On the other hand, this doesn't mean that every developer has to
reinvent the wheel. Input state caching could certainly be split off
into a client library.


>
> > Is this referring to the problem of "oops, my mouse left the Quake
> > window when I tried to turn"? Or maybe more of "oops, the pointer hit
> > the monitor edge and I cannot turn any more?" I.e. absolute vs.
> > relative input events?
>
> Partly.  The issue is that *sometimes* a game wants the mouse and
> keyboard to behave in the standard way (ie: the mouse controls the
> pointer and lets you click gui elements, the keyboard is for entering
> text and hitting control keys) and *sometimes* the game wants the
> mouse motion to control an in-game object (often the camera) and just
> wants the keyboard and mouse buttons to be a big bag of digital
> buttons.  With the Quake example, when the pause menu is up, or when
> the terminal has been called down, the game wants the keyboard to be
> generating text commands on the terminal and the mouse to be able to
> select text and click on buttons.  When the terminal is gone and the
> game isn't paused, Quake wants the mouse to control the camera view
> and the keyboard WASD keys are emulating a game controller dpad.
>
> So, yes, absolute vs. relative events is part of the issue, but
> it's also part of a greater context; whether the keyboard is
> generating strings or digital inputs, whether the mouse is generating
> positions or deltas, under what circumstances focus is allowed to
> leave the window, whether the mouse pointer is visible, and things
> like how system-wide hotkeys factor in to things.  Can I capture the
> keyboard and mouse without preventing the user from using alt-tab to
> switch to another program, for instance?
>

Well, at least with the "relative motion" proposal Pekka linked,
such a grab would always be breakable by things such as Alt-Tab.
AFAIK the general consensus is that Wayland clients should never ever
be able to "hold the display server hostage" as would often happen in X11
with faulty clients (ie. unable to leave a fullscreen hanging game).
So you shouldn't worry about that.


>
> Clean, fast switching between these states is part of it as well;
> in a game like Quake, as above, you want to be able to capture the
> mouse when the game is playing, but "uncapture" it 

Re: Input and games.

2013-04-19 Thread Todd Showalter
On Fri, Apr 19, 2013 at 1:52 PM, Jonas Kulla  wrote:

> I can totally see where you're coming from (having worked on a small
> engine myself in the past), but I feel like creating static input states
> should always be done on client side. Especially in Wayland where
> "frame-perfectness" is crucial, round-trips such as input state polling
> are strongly discouraged.
> On the other hand, this doesn't mean that every developer has to
> reinvent the wheel. Input state caching could certainly be split off
> into a client library.

That's perfectly workable; as I said elsewhere, we already do
essentially that on all other desktop environments, so it's not a
hardship.

> Well, at least with the "relative motion" proposal Pekka linked,
> such a grab would always be breakable by things such as Alt-Tab.
> AFAIK the general consensus is that Wayland clients should never ever
> be able to "hold the display server hostage" as would often happen in X11
> with faulty clients (ie. unable to leave a fullscreen hanging game).
> So you shouldn't worry about that.

Ok, that's what I was hoping.

> I think the biggest reason for "relative motion" is to prevent having to
> warp
> the pointer, which wayland won't support. Everything else, ie. how your
> client interprets keyboard/scroll events, is up to you. I don't think it's
> wise to let compositors send "text" events, that's the reason the keymap is
> provided as a fd. And it's certainly not a good idea to tell them to
> suddenly
> provide different events for the same physical buttons/keys.

From a game point of view, when we want relative motion we often
don't want to see the pointer at all; Quake being an example.  On
the other hand, sometimes we want to park the pointer
somewhere (like the edge of a window) and use pointer motion as a
delta to apply to an object (like scrolling a map).

>> There's also the question of clean recovery; if a game has changed
>> the video mode (if that's allowed any more, though these days with LCD
>> panels and robust 3D hardware maybe that's just a bad idea), turned
>> off key repeat and captured the mouse, all of that needs to be
>> reverted if the game exits ungracefully.  Which sometimes happens,
>> especially during development.
>
> True. But as I mentioned above, what is even more critical than that is the
> ability to escape _hanged_ clients without having to reboot your PC.
> I think in that area the wayland devs are already cautious about doing the
> right thing. Global (compositor) hotkeys/shortcuts will never be able to
> be swallowed by clients AFAIK.

By "exits ungracefully" I mean "hanged", "died with sigsegv",
"died with sigbus", "deadlocked" or what have you.  Anything where the
game can no longer be relied upon to undo the state changes it
requested.

> Ok, now this seems REALLY confusing to me. In over 10 years of using
> computers I have never come across a single application behaving like that.
> Could you maybe name an example? Also, isn't this the reason mouse
> wheels were invented in the first place? I can see where you're coming
> from, but for me personally scroll bars have always been more of an
> "absolute" control widget, ie. I use them when I want to go exactly to
> the beginning/end/40% mark of a viewport.

I'm stuck on a Mac at the moment, unfortunately, so I can't give
you verified Linux examples, but I'm writing this email in the GMail
web client on Firefox.  If I grab the scroll thumb and drag it
upwards, the mouse pointer moves far more slowly than if I just moved
the mouse up without grabbing the thumb, and I see all of the text
scroll by.  If the thumb was moving at the speed my mouse normally
moves, I'd have jumped over 80% of the text.  LibreOffice is showing
me the same behavior, and I remember similar behavior in lots of
other places.  IIRC it was also pretty much standard behavior on
well-behaved Win32 programs, at least from Win95 or so until XP; I've
managed to largely avoid Windows since then, so I couldn't tell you
what it does these days.

The thing is, with this model the thumb *position* is still
absolute; if you scroll the thumb to 40% of the way down the bar, the
view shows you 40% of the way down the document.  The difference is
that instead of warping the document past the view at the thumb's rate
of travel, the thumb's rate of travel is accelerated or attenuated to
match a sane document scrolling speed, and the pointer is warped to
keep it over the thumb.  It's a subtle effect, but I bet you'll find
that a lot of the software you use actually does it.  I've definitely
seen this built to respect pointer acceleration as well, so that if
you really do yank the mouse, the document just flies by.  But if
you're moving at a normal rate, the pointer moves slowly enough that
the document scrolls at a reasonable speed.
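
In code terms, it boils down to scaling the pointer delta by the
view:document ratio and warping the pointer back onto the thumb; a
sketch, assuming the windowing system allows the warp (names invented):

void drag_thumb(float pointer_dy, float view_h, float doc_h,
                float *thumb_y, float *doc_offset)
{
    float thumb_dy = pointer_dy * (view_h / doc_h);  /* attenuated travel */

    *thumb_y += thumb_dy;
    *doc_offset = (*thumb_y / view_h) * doc_h;  /* position stays absolute */

    /* ...and the pointer is warped back onto the thumb (not shown). */
}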

The important thing is, without that ability, you simply can't
scroll long documents sanely with a scroll bar thumb.  I suppose you
coul

Re: Input and games.

2013-04-19 Thread Todd Showalter
On Fri, Apr 19, 2013 at 2:55 PM, Todd Showalter  wrote:

> I suppose you could also handle this with pointer sensitivity
> modification; if you know the document:view ratio is 10:1, you could
> scale the pointer speed by 0.1 when the scroll wheel was grabbed while
> not allowing warping.

s/wheel/thumb/;  I fail at proofreading.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-04-19 Thread Bill Spitzak

Todd Showalter wrote:

> On Fri, Apr 19, 2013 at 1:52 PM, Jonas Kulla  wrote:
>
> I'm stuck on a Mac at the moment, unfortunately, so I can't give
> you verified Linux examples, but I'm writing this email in the GMail
> web client on Firefox.  If I grab the scroll thumb and drag it
> upwards, the mouse pointer moves far more slowly than if I just moved
> the mouse up without grabbing the thumb...

I have seen this before too, on Windows software.

I think this is going to require pointer warping. At first I thought it
could be done by hiding the pointer and faking its position, but that
would not stop the invisible pointer from moving out of the window and
becoming visible, or moving into a hot-spot and triggering an unexpected
effect.

> Also, I'm pretty sure something like mouse warping will likely never be
> implemented in wayland for design reasons (you don't want malicious
> apps controlling crucial variables such as pointer location).

This can be avoided by not allowing you to warp the pointer unless you
have the seat focus. You may even require the pointer focus, though I
think there are useful schemes where the client that has keyboard focus
can move the related pointer.



Re: Input and games.

2013-04-20 Thread Daniel
On Fri, 19 Apr 2013 at 14:55 -0400, Todd Showalter wrote:

> From a game point of view, when we want relative motion we often
> don't want to see the pointer at all; Quake being an example.  On
> the other hand, sometimes we want to park the pointer
> somewhere (like the edge of a window) and use pointer motion as a
> delta to apply to an object (like scrolling a map).

This is useful for desktop software too. I'm thinking of Stellarium or
Google Earth, where moving the mouse is expected to move the
environment, not the pointer itself. 





Re: Input and games.

2013-04-20 Thread Todd Showalter
On Sat, Apr 20, 2013 at 12:20 PM, Daniel  wrote:

> This is useful for desktop software too. I'm thinking of Stellarium or
> Google Earth, where moving the mouse is expected to move the
> environment, not the pointer itself.

"Games" is really perhaps shorthand here; there are a lot of tools
and so forth that have similar behavior and operating requirements to
games, but aren't strictly games per se.  If you have an architectural
walkthrough program that lets you navigate a building and make
alterations, that's not really something you'd call a game, but it is
operating under many of the same constraints.  It's more obvious in
things using 3D, but even the 2D side can use it in places.

I could easily see (for example) wanting to be able to do drag &
drop within a window on a canvas larger than the window can display;
say it's something like dia or visio or the like.  I drag an icon from
the sidebar into the canvas, and if it gets to the edge of the canvas
window the canvas scrolls and the dragged object (and the pointer)
parks at the window edge.

It's useful behavior.  I can definitely see why adding it to the
protocol makes things more annoying, but I've a strong suspicion it's
one of those things that if you leave it out you'll find that down the
road there's a lot of pressure to find a way to hack it in.

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-04-20 Thread Nick Kisialiou
Todd,

Generic device input may be too complicated to put into the Wayland
protocol. For example, take the Razer Hydra controller:
http://www.engadget.com/2011/06/08/razer-totes-hydra-sticks-and-6400dpi-dual-sensor-mice-to-e3-2011/

There are two USB-connected controllers, one for each hand, each with 6
DOF of information: 3D position and 3D rotation. I programmed it
for a 3D environment rather than games. Each controller sends you a
quaternion from which to extract the data. On top of that, the output is
noisy, so you'd want to add filters to integrate the noise out.

The last thing I'd want is to have a middleman between the USB port and my
processing code that messes around with rotation matrices and introduces
delays. I think it is reasonable to limit the protocol to mouse-like devices
only. As long as the protocol allows 2 mice simultaneously in the system
(which it does), IMHO, the rest of the processing is better placed within
your own code.

Nick


On Sat, Apr 20, 2013 at 9:38 AM, Todd Showalter wrote:

> On Sat, Apr 20, 2013 at 12:20 PM, Daniel  wrote:
>
> > This is useful for desktop software too. I'm thinking of Stellarium or
> > Google Earth, where moving the mouse is expected to move the
> > environment, not the pointer itself.
>
> "Games" is really perhaps shorthand here; there are a lot of tools
> and so forth that have similar behavior and operating requirements to
> games, but aren't strictly games per se.  If you have an architectural
> walkthrough program that lets you navigate a building and make
> alterations, that's not really something you'd call a game, but it is
> operating under many of the same constraints.  It's more obvious in
> things using 3D, but even the 2D side can use it in places.
>
> I could easily see (for example) wanting to be able to do drag &
> drop within a window on a canvas larger than the window can display;
> say it's something like dia or visio or the like.  I drag an icon from
> the sidebar into the canvas, and if it gets to the edge of the canvas
> window the canvas scrolls and the dragged object (and the pointer)
> parks at the window edge.
>
> It's useful behavior.  I can definitely see why adding it to the
> protocol makes things more annoying, but I've a strong suspicion it's
> one of those things that if you leave it out you'll find that down the
> road there's a lot of pressure to find a way to hack it in.
>
> Todd.
>
> --
>  Todd Showalter, President,
>  Electron Jump Games, Inc.


Re: Input and games.

2013-04-20 Thread Todd Showalter
On Sat, Apr 20, 2013 at 5:13 PM, Nick Kisialiou  wrote:

> There are two USB-connected controllers, one for each hand, each with 6 DOF
> of information: 3D position and 3D rotation. I programmed it for
> a 3D environment rather than games. Each controller sends you a quaternion
> from which to extract the data. On top of that, the output is noisy, so you'd
> want to add filters to integrate the noise out.

There's definitely crazy stuff out there, and I wouldn't argue for
an input abstraction that deals with all of it.  What I would argue,
however, is that the console gamepad is now standardized enough to
qualify for system support.  If you consider:

- the PS2 Dual Shock controller
- the PS3 Dual Shock controller
- the XBox controller
- the XBox 360 controller
- the GameCube controller
- the Wii "Classic" controller
- most PC gamepads

We've got a pretty common set of functionality.  Two analog sticks
(left and right thumb), a dpad, some analog shoulder triggers and a
set of buttons.  There may be other functionality (lights, vibration,
pressure sensors, tilt sensors), and that extended functionality is
almost certainly beyond the scope of Wayland, but it would be nice if
there was at least basic support for the common case.

By analogy, Razer also makes this:

http://www.razerzone.com/gaming-mice/razer-naga-epic/

It's a mouse with 17 buttons and customizable lighting.  I'd
expect Wayland to support that as a mouse, but I wouldn't expect it to
understand all the extra buttons or the software-controllable LED
lights.  It would be nice if there was a way to get at the extra
functionality, but for special-case devices like these I think it's
reasonable to expect that support to be external to Wayland.

Likewise, I'd be content with support for the common things in
gamepads, with non-common elements like force feedback, tilt sensors,
lights, pressure sensitive buttons and so forth relegated to second
class (external library) status.  Ideally it would be wonderful to
support all the features of every oddball controller, but I'd far
rather have easy support for standard gamepads than the mess we have
now.

The data just needs to be something like:

typedef struct
{
  float x;

} VEC2;

typedef struct
{
} GAMEPAD_DATA;


Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-04-20 Thread Todd Showalter
On Sat, Apr 20, 2013 at 10:31 PM, Todd Showalter  wrote:

I hate it when I fat-finger send.

The data just needs to be something like:

typedef struct
{
    float x;
    float y;
} VEC2;

typedef struct
{
    VEC2     l_stick;
    VEC2     r_stick;
    float    l_shoulder;
    float    r_shoulder;
    uint64_t buttons;
} GAMEPAD_DATA;

That's a complete controller state in 32 bytes.  The analog values
in actual hardware are usually returned as unsigned byte
values, but from a protocol point of view converting each stick axis
to the range [-1.0f .. 1.0f] and the triggers to [0.0f .. 1.0f] is
saner.  If Wayland just delivers that data it will cover most needs.
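
The conversion itself is trivial; a sketch, assuming raw 8-bit axes
(an assumption about typical hardware, not any spec):

#include <stdint.h>

float stick_axis(uint8_t raw)      /* 0..255, 128 is roughly centered */
{
    return ((float)raw - 128.0f) / 127.0f;    /* ~[-1.0f .. 1.0f] */
}

float trigger_axis(uint8_t raw)    /* 0..255 */
{
    return (float)raw / 255.0f;    /* [0.0f .. 1.0f] */
}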

  Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-04-20 Thread Todd Showalter
On Fri, Apr 19, 2013 at 7:08 PM, Bill Spitzak  wrote:

> I think this is going to require pointer warping. At first I thought it
> could be done by hiding the pointer and faking its position, but that would
> not stop the invisible pointer from moving out of the window and becoming
> visible, or moving into a hot-spot and triggering an unexpected effect.

I think edge resistance/edge snapping really wants pointer warping as well.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.


Re: Input and games.

2013-04-21 Thread Martin Minarik
Hello Todd


> That's a complete controller state in 32 bytes.  The analog values
> in actual hardware are usually returned as unsigned byte
> values, but from a protocol point of view converting each stick axis
> to the range [-1.0f .. 1.0f] and the triggers to [0.0f .. 1.0f] is
> saner.  If Wayland just delivers that data it will cover most needs.

I disagree with inventing a new protocol. The joypad
devices are generally already supported in the linux
kernel[1].

Pretty much all such a device ever does is send key and
abs data.

The protocol is similar to evdev. We may be able to send
it over the wayland protocol to the applications in a
straightforward way.

A special device wl_joypad will be created that will
support button and abs axis events. It will be delivered
to applications that subscribe to this type of protocol.
What do you think?

References:

[1] linux/input/input/joystick/xpad.c




Re: Input and games.

2013-04-21 Thread John-John Tedro
I'm a hobby game dev; what follows is a description of my dream input system.

Apart from the existing input protocol...

I would expect Wayland to support a protocol that can enumerate input
devices and discover their capabilities regardless of type. It follows
that Wayland would need a sensible set of I/O primitives (delta, vector,
absolute, quaternion, buttonset, rumble).
These should be coupled with metadata that lets applications display
sensible information about keybindings (e.g. a movement axis mapped to
Left Stick Y).
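
To illustrate, I'm imagining something along these lines (purely
hypothetical, nothing like this exists today):

enum input_primitive
{
    PRIM_DELTA,
    PRIM_VECTOR,
    PRIM_ABSOLUTE,
    PRIM_QUATERNION,
    PRIM_BUTTONSET,
    PRIM_RUMBLE,
};

struct input_capability
{
    enum input_primitive type;
    const char *label;   /* e.g. "Left Stick Y", for keybinding UIs */
};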

The compositor in turn might require a whole slew of drivers (or very few)
to support all the different variations out there under this protocol.
Conflicts and priorities need to be handled, since one driver might be very
generic (a generic gamepad) while another is specialized (an Xbox 360
controller).

Now I have little to no idea how an input model like this would manifest
itself in a library like SDL.

My five cents.


On Sun, Apr 21, 2013 at 2:14 PM, Martin Minarik <minari...@student.fiit.stuba.sk> wrote:

> Hello Todd
>
>
> > That's a complete controller state in 32 bytes.  The analog values
> > in actual hardware are usually returned as unsigned byte
> > values, but from a protocol point of view converting each stick axis
> > to the range [-1.0f .. 1.0f] and the triggers to [0.0f .. 1.0f] is
> > saner.  If Wayland just delivers that data it will cover most needs.
>
> I disagree with inventing a new protocol. The joypad
> devices are generally already supported in the linux
> kernel[1].
>
> Pretty much all such a device ever does is send key and
> abs data.
>
> The protocol is similar to evdev. We may be able to send
> it over the wayland protocol to the applications in a
> straightforward way.
>
> A special device wl_joypad will be created that will
> support button and abs axis events. It will be delivered
> to applications that subscribe to this type of protocol.
> What do you think?
>
> References:
>
> [1] linux/input/input/joystick/xpad.c


Re: Input and games.

2013-04-21 Thread Todd Showalter
On Sun, Apr 21, 2013 at 8:14 AM, Martin Minarik wrote:

> I disagree with inventing a new protocol. The joypad
> devices are generally already supported in the linux
> kernel[1].

They are, but from a developer point of view it's not pretty; it's
been a while since the last time I messed with evdev (basically after
getting frustrated with the limitations of libjsw), but IIRC:

- you need elevated permission to open the device -- risky in
something as complex as a game, especially when it may well be running
mods, scripts or data from untrusted sources (ie: found on a forum)

- the data comes in as EV_KEY, EV_REL and EV_ABS messages, which is
fine, but I found no programmatic way to discover the structure of the
device -- I don't know how relative axis 3 being set to 0.5 maps onto
what the player is doing with the actual device

So, not ideal.  It's a decent basis for a higher level system,
assuming there's some way to query device type and determine how the
various parts of the device actually map onto the physical controller,
but something has to supply that higher level system, and right now
it's the game.

The caveat here is that I last touched libjsw and evdev a few
years back; perhaps there have been improvements or someone wrote the
higher level support code and I missed it.

On the permissions side of things, though, the ideal is that a
game be able to setuid to something with nearly no permissions at all
and still be able to receive gamepad, keyboard and mouse messages.
Not all games will want to do that; some things will want more
filesystem access than just a save directory, some games will want to
be able to bind network ports and so forth.  But ideally, for a game
that is local-only and only needs to access a small directory to save
its state, it should be able to have its own unprivileged userid that
it runs as without sacrificing gamepad input.

> The protocol is similar to evdev. We may be able to send
> it over the wayland protocol to the applications in a
> straightforward way.

This is workable, but identifying how the parts map to the device
is still a problem.  With a keyboard, you get keysyms that tell you
what keys the messages correspond to.  With a mouse, you get an
explicitly labelled 2d vector for the pointer and numbered buttons in
a mask that you can directly map onto the hardware without worry.
With gamepads, unless things have changed, the axis and button values
come in unchanged from the order they were on the hardware, which
means that you have no idea which control a given button or axis maps
to.
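
To be concrete, evdev does let you enumerate which absolute axes a
device has and what their ranges are; what it doesn't tell you is which
physical control each axis is. A sketch (error handling mostly omitted):

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/input.h>

#define LONG_BITS (8 * sizeof(unsigned long))

int main(void)
{
    unsigned long bits[(ABS_MAX + LONG_BITS) / LONG_BITS];
    int fd, code;

    fd = open("/dev/input/event0", O_RDONLY);   /* device node is an example */
    if(fd < 0)
        return 1;

    memset(bits, 0, sizeof(bits));
    ioctl(fd, EVIOCGBIT(EV_ABS, sizeof(bits)), bits);

    for(code = 0; code <= ABS_MAX; code++)
    {
        if(bits[code / LONG_BITS] & (1ul << (code % LONG_BITS)))
        {
            struct input_absinfo info;

            ioctl(fd, EVIOCGABS(code), &info);
            /* is this a stick, a trigger or a button?  no way to tell */
            printf("abs axis %d: range [%d, %d]\n",
                   code, info.minimum, info.maximum);
        }
    }

    return 0;
}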

That's the reason I'm suggesting a standardized layout, since
devices now make that feasible.  That's not to say that the standard
layout needs to be transmitted with every message.  I'd be perfectly
content if it was still component-based messages, as long as enough
information was available.  To put it in (cut down) X11 terms:

typedef struct {
    [... protocol metadata ...]

    int type;             // press or release
    unsigned int keycode; // detail
} GamepadKeyEvent;

typedef struct {
    [... protocol metadata ...]

    int type;    // stick
    int index;   // 0 is left stick, 1 is right stick
    float x, y;  // deflection [-1.0f .. 1.0f]
} GamepadStickEvent;

typedef struct {
    [... protocol metadata ...]

    int type;    // axis
    int index;   // 0 is left trigger, 1 is right trigger
    float x;     // deflection [0.0f .. 1.0f]
} GamepadAxisEvent;

So, it could support arbitrary controllers with more than 2
sticks, but the user would be guaranteed that the first two sticks map
to the standard left and right sticks on the player's game controller.
Likewise, it could support more than two single-axis values, but the
common case is standardized.  Most games won't need any extra glue
code to work with a scheme like this.
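
As a sketch, consuming those events is then just routing fields into
the virtualized pad state (reusing the GAMEPAD_DATA struct from my
earlier mail; the glue names are invented):

void gamepad_stick_event(GAMEPAD_DATA *pad, const GamepadStickEvent *event)
{
    VEC2 *stick = (event->index == 0) ? &pad->l_stick : &pad->r_stick;

    stick->x = event->x;
    stick->y = event->y;
}

void gamepad_axis_event(GAMEPAD_DATA *pad, const GamepadAxisEvent *event)
{
    if(event->index == 0)
        pad->l_shoulder = event->x;
    else
        pad->r_shoulder = event->x;
}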

The fun bit is the keysyms for the buttons.  Everyone labels their
buttons differently.  While most controllers have a "start" button
somewhere, the face buttons (ie: the ones games usually want to tell
you to press...) are a mess.  Clockwise from the top, we have:

PS*: green triangle, red circle, blue x, pink square
XBox: yellow Y, red B, green A, blue X
Wii Classic: solid x, clear a, clear b, solid y
Ouya: yellow Y, red A, green O, blue U

So, I think a sane protocol can assume four face buttons, since
that seems to be standard since Sega stopped making hardware, but
they'll need hardware-independent names; top/bottom/left/right or
compass directions or the like.  The "start" button maps across
hardware.  Some systems have "home" or "select" buttons that can be
mapped, and most systems have shoulder triggers that can activate as
buttons.  Many of the systems also have "click" on their analog
sticks, and obviously dpads should map cleanly across hardware.
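
As a sketch, the hardware-independent names could look something like
this (illustrative only, not a proposed standard):

typedef enum
{
    PAD_FACE_NORTH,     /* PS triangle, XBox Y, Wii Classic x, Ouya Y */
    PAD_FACE_EAST,      /* PS circle,   XBox B, Wii Classic a, Ouya A */
    PAD_FACE_SOUTH,     /* PS x,        XBox A, Wii Classic b, Ouya O */
    PAD_FACE_WEST,      /* PS square,   XBox X, Wii Classic y, Ouya U */
    PAD_START,
    PAD_SELECT,
    PAD_HOME,
    PAD_L_SHOULDER,
    PAD_R_SHOULDER,
    PAD_L_STICK_CLICK,
    PAD_R_STICK_CLICK,
    PAD_DPAD_UP,
    PAD_DPAD_DOWN,
    PAD_DPAD_LEFT,
    PAD_DPAD_RIGHT,
} GAMEPAD_BUTTON;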

So, yes, an event-based system that delivers gamepad events
piecewise is perfectly workable, but for it to be useful to games it
needs to provide some way for the game to map that informatio

Re: Input and games.

2013-04-22 Thread Pekka Paalanen
Hi Todd,

Jonas Kulla already replied on several items, but it's easier for
me to comment on everything I have something to say, so pardon if I
repeat some things.


On Fri, 19 Apr 2013 12:31:19 -0400
Todd Showalter  wrote:

> On Fri, Apr 19, 2013 at 5:18 AM, Pekka Paalanen  wrote:
> 
> > I am going to reply from the Wayland protocol point of view, and what
> > Wayland explicitly can (and must) do for you. This is likely much lower
> > level than what a game programmer would like to use. How SDL or some
> > other higher level library exposes input is a different matter, and I
> > will not comment on that. We just want to make everything possible on
> > the Wayland protocol level.
> 
> That's fair.  We don't use SDL in our projects, so I'm coming at
> this partly from the point of view of someone who will be operating at
> the protocol level.

The protocol level is not convenient for an application developer in
some cases, and it's not even meant to be. We explicitly leave lots of
processing for a so-called toolkit library. At several points below I
say that something is out of the scope of libwayland-client, and my
above comment was just a fore-warning about that. :-)

> > I do not think we can happily let client applications open input devices
> > themselves, so this is clearly a thing we need to improve on. In other
> > words, I believe we should come up with a protocol extension where the
> > server opens the input devices, and either passes the file descriptor to
> > a client, or the server translates evdev events into Wayland protocol
> > events. "How" and "what" are still open questions, as is every other
> > detail of input devices that are not keyboards, mice, or touchscreens.
> 
> This is certainly what I'd prefer, personally, whether it's a
> file-descriptor based system, event messaging, or polling functions.
> It would be really nice to get gamepads and the like in there, if
> possible.
> 
> > There was once some talk about "raw input event protocol", but there is
> > not even a sketch of it, AFAIK.
> 
> I'm not familiar enough with Wayland yet to take the lead on
> something like that, but I can certainly help.
> 
> >> It would be really nice if there was some sort of configuration
> >> that could be read so we'd know how the player wanted these things
> >> mapped, and some sort of way for the player to set that configuration
> >> up outside the game.
> >
> > Right, and whether this could be a Wayland thing or not, depends on the
> > above, how to handle misc input devices in general.
> >
> > Keyboards already have extensive mapping capabilities. A Wayland server
> > sends keycodes (I forget in which space exactly) and a keymap, and
> > clients feed the keymap and keycodes into libxkbcommon, which
> > translates them into something actually useful. Maybe something similar
> > could be invented for game controllers? But yes, this is off-topic for
> > Wayland, apart from the protocol of what event codes and other data to
> > pass.
> 
> Fair enough.

In other emails, it seems you are really looking for a mapping
library for gamepads and joysticks, at least for the usual devices.
While the Wayland protocol should support this, I do not think it
is in scope for Wayland to actually define it. As with keyboards,
the Wayland protocol allows passing keymaps around, and one type of
keymaps is what xkbcommon uses. xkbcommon then actually defines the
mappings, symbols, some state tracking, etc. which Wayland does not.

Such a mapping library and standard should be started as a separate
project, initially building on top of evdev directly. When that
works, we can come up with the Wayland protocol extension to
support that.
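
To illustrate the shape such a library might take, in the spirit of
xkbcommon (every name here is hypothetical):

struct gamepad_map;   /* per-device-model mapping, like an xkb keymap */

/* Load a mapping for the device behind an evdev node. */
struct gamepad_map *gamepad_map_from_evdev(const char *device_node);

/* Translate a raw evdev (type, code) pair into a logical control,
   e.g. "left stick X" or "south face button". */
int gamepad_map_translate(const struct gamepad_map *map,
                          int ev_type, int ev_code);

void gamepad_map_destroy(struct gamepad_map *map);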

> > The Wayland protocol is event driven. Polling does not make sense, since it
> > would mean a synchronous round-trip to the server, which for something
> > like this is just far too expensive, and easily (IMHO) worked around.
> >
> > So, you have to maintain input state yourself, or have a library you use do it.
> > It could even be off-loaded to another thread.
> 
> This is what we do now, essentially; accumulate the incoming
> events to assemble each frame's input device state.  It would be
> convenient if Wayland did it for us, but obviously we're already
> operating this way on X11, Win32 and OSX.

If by Wayland here you mean libwayland-client, then no, it is out
of scope. libwayland-client is only a protocol binding for C, more
akin to libxcb than to Xlib, if I have understood them right. It does
not keep state; that is intended for higher level libraries or
toolkits.

> > There is also a huge advantage over polling: in an event driven design,
> > it is impossible to miss very fast, transient actions, which polling
> > would never notice. And whether you need to know if such a transient
> > happened, or how many times it happened, or how long each
> > transient took between two game ticks, is all up to you and available.
> 
> In truth, we don't usually deal with 

Re: Input and games.

2013-04-22 Thread Todd Showalter
On Mon, Apr 22, 2013 at 3:41 AM, Pekka Paalanen  wrote:

> The protocol level is not convenient for an application developer in
> some cases, and it's not even meant to be. We explicitly leave lots of
> processing for a so-called toolkit library. At several points below I
> say that something is out of the scope of libwayland-client, and my
> above comment was just a fore-warning about that. :-)

I'm aware; that said, our engine works on consoles as well, and
we're used to working close to the metal.  We may use intermediary
libraries for some things, but odds are good that for input and
graphics, at least for our engine, we'll be talking directly to (I
presume) libwayland-client.

> In other emails, it seems you are really looking for a mapping
> library for gamepads and joysticks, at least for the usual devices.
> While the Wayland protocol should support this, I do not think it
> is in scope for Wayland to actually define it. As with keyboards,
> the Wayland protocol allows passing keymaps around, and one type of
> keymaps is what xkbcommon uses. xkbcommon then actually defines the
> mappings, symbols, some state tracking, etc. which Wayland does not.

The argument I've been trying to make is that there's a set of
gamepad functionality that is very analogous to the "standard" set of
mouse functionality.  IIRC the standard mouse functionality is
generally accepted to be defined by the USB HID spec.  My argument is
that there is a basic set of gamepad functionality (two analog sticks,
a dpad, two analog triggers, some buttons) that have become a de facto
standard such that they merit support.

This is the kind of thing that will (hopefully) be more important
as people start building living room gaming PCs; I can't overstate the
importance of having basic game input "just work".

I'd argue that the mouse analogy is more relevant than the
keyboard analogy; there are features present on only some keyboards
(mode shift keys, alt-gr...) that are essential to the input methods
those keyboards use.  The keyboard system needs to support a fairly
extensive set of differing hardware in order to do its job at all,
unless we're going to slink back to the 7 bit ASCII days.

Gamepads, by contrast, are all mostly the same these days, much
like mice.  You can find oddball ones like that PC gamepad that was up
on Kickstarter recently which had a trackball in place of the right
thumb stick, but the core gamepad is now every bit as standardized as
the core mouse.

> Such a mapping library and standard should be started as a separate
> project, initially building on top of evdev directly. When that
> works, we can come up with the Wayland protocol extension to
> support that.

Hmm.  Who would I talk to about getting this started?

>> Which reminds me; it would be extremely useful to be able to shut
>> off key repeat for a specific client (ie: a game) without having to
>> shut it off globally.
>
> I believe key repeat is implemented client-side, so there is
> nothing to switch off. I think whether a key repeats or not depends
> also on the keymap, which the server does not process on clients'
> behalf. Instead, clients are handed the keymap and raw key values,
> and expected to do the right thing. (This is yet another thing left
> for toolkits.)

The problem in the past here has been that in a game, you don't
typically want the keyboard to autorepeat.  In other systems (X11, for
one, IIRC) key repeat was a global setting; if you turned it off for
your program, you turned it off for everything.  So, suddenly key
repeat doesn't work in the terminal or your editor either.

What would be nice is if there were a clean mechanism whereby a
program could say "here's how I need my input conditioned", and only
that program would be affected.

> There is actually a design principle behind having key repeat in the
> client side. Real user input always corresponds to some physical action
> of the user, happening at a point in time. The kernel input drivers
> tell us the timestamp. All Wayland protocol events corresponding to
> real user input have that timestamp. Key repeat events would not be
> real user input events.

IIRC some autorepeat is actually hardware generated; I thought
that was something living in the BIOS, though maybe that's no longer
true.

> Furthermore, we do not fake input events. If we wanted to support e.g.
> pointer warping, we would probably need a new event to tell clients
> that the pointer was moved by something else than the user, and it
> would not have a timestamp (since we cannot assume that we can fake
> a meaningful timestamp).

I would have thought that pointer warping specifically is
one of those cases where the right thing to do is have a server
round-trip; the program with focus requests pointer warp, and the
server comes back either with a mouse move event that has been
suitably flagged, or with a warp event.  Aside from anything else,
that means:


Re: Input and games.

2013-04-22 Thread Pekka Paalanen
On Mon, 22 Apr 2013 10:42:35 -0400
Todd Showalter  wrote:

> On Mon, Apr 22, 2013 at 3:41 AM, Pekka Paalanen  wrote:
> 
> > The protocol level is not convenient for an application developer in
> > some cases, and it's not even meant to be. We explicitly leave lots of
> > processing for a so-called toolkit library. At several points below I
> > say that something is out of the scope of libwayland-client, and my
> > above comment was just a fore-warning about that. :-)
> 
> I'm aware; that said, our engine work on consoles as well, and
> we're used to working close to the metal.  We may use intermediary
> libraries for some things, but odds are good that for input and
> graphics, at least for our engine, we'll be talking directly to (I
> presume) libwayland-client.
> 
> > In other emails, it seems you are really looking for a mapping
> > library for gamepads and joysticks, at least for the usual devices.
> > While the Wayland protocol should support this, I do not think it
> > is in scope for Wayland to actually define it. As with keyboards,
> > the Wayland protocol allows passing keymaps around, and one type of
> > keymaps is what xkbcommon uses. xkbcommon then actually defines the
> > mappings, symbols, some state tracking, etc. which Wayland does not.
> 
> The argument I've been trying to make is that there's a set of
> gamepad functionality that is very analogous to the "standard" set of
> mouse functionality.  IIRC the standard mouse functionality is
> generally accepted to be defined by the USB HID spec.  My argument is
> that there is a basic set of gamepad functionality (two analog sticks,
> a dpad, two analog triggers, some buttons) that have become a de facto
> standard such that they merit support.
> 
> This is the kind of thing that will (hopefully) be more important
> as people start building living room gaming PCs; I can't overstate the
> importance of having basic game input "just work".
> 
> I'd argue that the mouse analogy is more relevant than the
> keyboard analogy; there are features present on only some keyboards
> (mode shift keys, alt-gr...) that are essential to the input methods
> those keyboards use.  The keyboard system needs to support a fairly
> extensive set of differing hardware in order to do its job at all,
> unless we're going to slink back to the 7 bit ASCII days.
> 
> Gamepads, by contrast, are all mostly the same these days, much
> like mice.  You can find oddball ones like that PC gamepad that was up
> on Kickstarter recently which had a trackball in place of the right
> thumb stick, but the core gamepad is now every bit as standardized as
> the core mouse.

Alright, do you really mean that the controls are as standard as
mouse buttons and wheels, and we would not need a per-device-model
database? If so, then sure, a mouse-like Wayland protocol would
indeed be possible.

> > Such a mapping library and standard should be started as a separate
> > project, initially building on top of evdev directly. When that
> > works, we can come up with the Wayland protocol extension to
> > support that.
> 
> Hmm.  Who would I talk to about getting this started?

I'm not sure. If you're looking for volunteers, just throwing the
idea out in public is a start, but to have some chances of
succeeding, you probably need to start the work yourself, or pay
someone to do it. If it turns out good, other projects might start
using it, and also contributing.

But as per above, maybe we really don't need it?

> >> Which reminds me; it would be extremely useful to be able to shut
> >> off key repeat for a specific client (ie: a game) without having to
> >> shut it off globally.
> >
> > I believe key repeat is implemented client-side, so there is
> > nothing to switch off. I think whether a key repeats or not depends
> > also on the keymap, which the server does not process on clients'
> > behalf. Instead, clients are handed the keymap and raw key values,
> > and expected to do the right thing. (This is yet another thing left
> > for toolkits.)
> 
> The problem in the past here has been that in a game, you don't
> typically want the keyboard to autorepeat.  In other systems (X11, for
> one, IIRC) key repeat was a global setting; if you turned it off for
> your program, you turned it off for everything.  So, suddenly key
> repeat doesn't work in the terminal or your editor either.
> 
> What would be nice is if there were a clean mechanism whereby a
> program could say "here's how I need my input conditioned", and only
> that program would be affected.

Looking at Weston, it seems to make the extra effort to ensure that
it does not send repeats to clients.

> > There is actually a design principle behind having key repeat in the
> > client side. Real user input always corresponds to some physical action
> > of the user, happening at a point in time. The kernel input drivers
> > tell us the timestamp. All Wayland protocol events corresponding to
> real user input have that timestamp. Key repeat events would not be
> real user input events.

Re: Input and games.

2013-04-22 Thread Todd Showalter
On Mon, Apr 22, 2013 at 1:40 PM, Pekka Paalanen  wrote:

>> Gamepads, by contrast, are all mostly the same these days, much
>> like mice.  You can find oddball ones like that PC gamepad that was up
>> on Kickstarter recently which had a trackball in place of the right
>> thumb stick, but the core gamepad is now every bit as standardized as
>> the core mouse.
>
> Alright, do you really mean that the controls are as standard as
> mouse buttons and wheels, and we would not need a per-device-model
> database? If so, then sure, a mouse-like Wayland protocol would
> indeed be possible.

What I mean is that in practice, the difference between game
controllers is almost entirely two things; which particular bit in the
button mask gets set/cleared by any particular button, and which axis
maps to which control.  Right now (unless things have changed), for
example, if you plug in an xbox 360 controller:

- left stick is axis (0, 1)
- left trigger is axis 2
- right stick is axis (3, 4)
- right trigger is axis 5
- dpad is axis (6, 7)

I had to determine that by logging values and playing with the
controls to see what made what numbers move.  The xbox controller (ie:
not 360) may be the same order, it may not be.  The Dual Shock
controller may be the same order, it likely isn't.  So unless we're
really lucky something has to convert the axis ordering to a canonical
form.

Likewise, the buttons are just indexed, since as far as I can tell
without cracking open the code, JSGetButtonState() is just:

return (buttons & (1 << index));

I'd vastly prefer keysyms here; I don't want to have to go look up
which button is START on this controller, or have to figure out which
index is the bottom right face button.

So, some layer needs to translate buttons to keysyms, and adjust
the axis ordering (and possibly scaling) to fit the canonical
controller model, which I would suggest essentially be two analog
sticks, two analog triggers, plus keys (where the four dpad directions
are keys).  The translation layer needn't be very complex; as long as
there's some way to query evdev or the underlying system to find out
exactly what kind of device this is, it's a simple matter of per-axis
scale, offset and index translation (ie: scale this axis by -0.5f, add
1.0f, map to left trigger) and a list of bit to keysym lookups.
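
Concretely, a minimal sketch of such a translation layer in C (every
name here is illustrative; no such library exists yet):

    /* One entry per raw axis: rescale the value and route it to a
     * canonical axis slot (0,1 = left stick, 2 = left trigger, ...). */
    struct axis_map {
        int   src_axis;   /* axis index as the device reports it */
        int   dst_axis;   /* canonical axis index */
        float scale;      /* e.g. -0.5f */
        float offset;     /* e.g. +1.0f to rebase a trigger */
    };

    /* One entry per raw button bit: route it to a canonical keysym. */
    struct button_map {
        int src_bit;      /* bit index in the device's button mask */
        int dst_sym;      /* canonical button symbol */
    };

    static float map_axis(const struct axis_map *m, float raw)
    {
        return raw * m->scale + m->offset;
    }

A per-device table would then just be arrays of these two structs,
selected by device identity.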

So, in terms of hardware capabilities, there is very much a
standard.  In terms of how that hardware is presented to the system
over USB, the same data is all there, but your guess is as good as
mine with regards to ordering.  Which is the problem I'd really like
to see solved.

>> Hmm.  Who would I talk to about getting this started?
>
> I'm not sure. If you're looking for volunteers, just throwing the
> idea out in public is a start, but to have some chances of
> succeeding, you probably need to start the work yourself, or pay
> someone to do it. If it turns out good, other projects might start
> using it, and also contributing.
>
> But as per above, maybe we really don't need it?

I'm hoping not.  If it is needed, I think it's going to have to
sit between Wayland and evdev; part of the point of this (to me, at
least) is to isolate the game from the kind of permissions that you
require to open evdev devices.  Or for that matter, isolate the player
from having to edit their udev rules just to get a gamepad working.

> Looking at Weston, it seems to make the extra effort to ensure that
> it does not send repeats to clients.

Excellent.

>> I would have thought that for pointer warping specifically that
>> it's one of those cases where the right thing to do is have a server
>> round-trip; the program with focus requests pointer warp, and the
>> server comes back either with a mouse move event that has been
>> suitably flagged, or with a warp event.  Aside from anything else,
>> that means:
>>
>> - the warp event isn't official until it has been blessed
>> - the warp event can be timestamped sanely
>> - the server has the option to modify or reject the warp
>
> Still, I think would be problematic. If a client continuously warps
> the pointer to the middle of its window, getting away from that
> would be difficult, and I can't see any heuristic the compositor
> could use to prevent that.
>
> Granted, it's not that different from pointer lock. I just believe
> that arbitrary pointer warping is disrupting to a user, and we need
> to limit it to special cases, like pointer lock. Even just
> requiring keyboard focus to be able to warp goes a long way.

That's the thing; as long as the user has a way of taking focus
away from the program (alt-tab or whatever), the malicious things you
can do with pointer warping are *exactly* the malicious things you can
do with pointer lock, and they can be escaped/broken the same way.

Personally, I can't stand things like warp-cursor-to-popup-window;
it drives me nuts, and often as not it steals focus when I'm in the
middle of something and misu

Re: Input and games.

2013-04-23 Thread Pekka Paalanen
On Mon, 22 Apr 2013 15:32:50 -0400
Todd Showalter  wrote:

> On Mon, Apr 22, 2013 at 1:40 PM, Pekka Paalanen 
> wrote:
> 
> >> Gamepads, by contrast, are all mostly the same these days, much
> >> like mice.  You can find oddball ones like that PC gamepad that
> >> was up on Kickstarter recently which had a trackball in place of
> >> the right thumb stick, but the core gamepad is now every bit as
> >> standardized as the core mouse.
> >
> > Alright, do you really mean that the controls are as standard as
> > mouse buttons and wheels, and we would not need a per-device-model
> > database? If so, then sure, a mouse-like Wayland protocol would
> > indeed be possible.
> 
> What I mean is that in practice, the difference between game
> controllers is almost entirely two things; which particular bit in the
> button mask gets set/cleared by any particular button, and which axis
> maps to which control.  Right now (unless things have changed), for
> example, if you plug in an xbox 360 controller:
> 
> - left stick is axis (0, 1)
> - left trigger is axis 2
> - right stick is axis (3, 4)
> - right trigger is axis 5
> - dpad is axis (6, 7)
> 
> I had to determine that by logging values and playing with the
> controls to see what made what numbers move.  The xbox controller (ie:
> not 360) may be the same order, it may not be.  The Dual Shock
> controller may be the same order, it likely isn't.  So unless we're
> really lucky something has to convert the axis ordering to a canonical
> form.
> 
> Likewise, the buttons are just indexed, since as far as I can tell
> without cracking open the code, JSGetButtonState() is just:
> 
> return (buttons & (1 << index));
> 
> I'd vastly prefer keysyms here; I don't want to have to go look up
> which button is START on this controller, or have to figure out which
> index is the bottom right face button.
> 
> So, some layer needs to translate buttons to keysyms, and adjust
> the axis ordering (and possibly scaling) to fit the canonical
> controller model, which I would suggest essentially be two analog
> sticks, two analog triggers, plus keys (where the four dpad directions
> are keys).  The translation layer needn't be very complex; as long as
> there's some way to query evdev or the underlying system to find out
> exactly what kind of device this is, it's a simple matter of per-axis
> scale, offset and index translation (ie: scale this axis by -0.5f, add
> 1.0f, map to left trigger) and a list of bit to keysym lookups.
> 
> So, in terms of hardware capabilities, there is very much a
> standard.  In terms of how that hardware is presented to the system
> over USB, the same data is all there, but your guess is as good as
> mine with regards to ordering.  Which is the problem I'd really like
> to see solved.

Hi Todd,

what you describe here is very much a keymap-like database for game
controllers: translating from button and axis indices to labels or
symbols. However, having a brief chat with Daniel Stone, it seems we
should not need these.

Take a look at /usr/include/linux/input.h

There you find definitions for BTN_A, BTN_X, BTN_START, ABS_X, ABS_Y,
ABS_RX, ABS_RY, ABS_HATnn, and many more. The kernel evdev interface
should already be giving out events with the correct label, so we would
not need any mapping.

Are you saying that the kernel gives out the labels wrong? If so, this
should be fixed in the kernel drivers. One thing less to care about in
Wayland. We "just" need to write the protocol for these devices, the
labels should be already there.

The current behaviour can be checked with evtest:
http://cgit.freedesktop.org/evtest/
Was that what you used to check the controller events?
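
For reference, a minimal C reader for the same event stream evtest
prints, assuming a readable device node (path hardcoded for brevity):

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <linux/input.h>

    int main(void)
    {
        struct input_event ev;
        int fd = open("/dev/input/event0", O_RDONLY); /* pick your node */

        if (fd < 0)
            return 1;
        while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
            if (ev.type == EV_KEY && ev.code == BTN_A)
                printf("BTN_A %s\n", ev.value ? "down" : "up");
            else if (ev.type == EV_ABS && ev.code == ABS_X)
                printf("ABS_X %d\n", ev.value);
        }
        close(fd);
        return 0;
    }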

> >> >> If the events are just coming in as a pile in 60Hz ticks,
> >> >> it's all good and we can do everything we need to.  If they're
> >> >> coming in as a pile at 10Hz ticks, it's going to be difficult
> >> >> to make anything more active than Solitaire.
> >> >
> >> > Yes as far as I understand, currently input events are sent to
> >> > Wayland clients as a burst at every compositor repaint cycle,
> >> > which happens at the monitor refresh rate, so for a 60 Hz
> >> > monitor, you would be getting them in bursts at 60 Hz.
> >>
> >> That's as good as we get from consoles, and besides, there
> >> isn't much point in most games in running your simulator at a
> >> higher frequency than the refresh rate.
> >
> > What about these hardcore fps-gamers who want at least 200+ frames
> > per second, or they can't win? :-)
> 
> If you're sending me vsync at 200Hz, I'll gladly update my
> simulation at 200Hz and chew input at 200Hz.  :)
> 
> Most players are playing on 60Hz refresh monitors, and those LCD
> monitors have enough lag on them that it really doesn't matter if the
> simulation ticks are happening (and eating input) faster than that.
> Even if you react at the speed of light (literally), you're
> interacting with data where the simulation probably sta

Re: Input and games.

2013-04-23 Thread Todd Showalter
On Tue, Apr 23, 2013 at 7:25 AM, Pekka Paalanen  wrote:

> what you describe here is very much a keymap-like database for game
> controllers: translating from button and axis indices to labels or
> symbols. However, having a brief chat with Daniel Stone, it seems we
> should not need these.
>
> Take a look at /usr/include/linux/input.h
>
> There you find definitions for BTN_A, BTN_X, BTN_START, ABS_X, ABS_Y,
> ABS_RX, ABS_RY, ABS_HATnn, and many more. The kernel evdev interface
> should already be giving out events with the correct label, so we would
> not need any mapping.
>
> Are you saying that the kernel gives out the labels wrong? If so, this
> should be fixed in the kernel drivers. One thing less to care about in
> Wayland. We "just" need to write the protocol for these devices, the
> labels should be already there.

I'm not saying the labels are wrong; I assume they are correct.
The problem is that the labels are hardware-specific, at least for the
buttons.  That said, it looks like the axis values are being properly
labelled, which means we're way closer to sane behaviour than we were
last time I looked into this.

I grabbed evtest and ran it with three devices; an xbox
controller, an xbox 360 controller, and a ps3 controller.  The
results:

xbox:
- button
   A B X Y START THUMBL THUMBR
   Z (white button)
   C (black button)
   SELECT (back button)
- axis
   ABS_X ABS_Y ABS_Z
   ABS_RX ABS_RY ABS_RZ
   ABS_HAT0X ABS_HAT0Y (dpad)

xbox 360:
- button
   A B X Y START THUMBL THUMBR
   TL  TR
   MODE (home button)
   SELECT (back button)
- axis
   ABS_X ABS_Y ABS_Z
   ABS_RX ABS_RY ABS_RZ
   ABS_HAT0X ABS_HAT0Y (dpad)

ps3:
  - button
TRIGGER  THUMB  THUMB2
TOP TOP2 PINKIE BASE  BASE2
BASE3  BASE3  BASE4  BASE5
BASE6  DEAD  TRIGGER_HAPPY17
TRIGGER_HAPPY18  TRIGGER_HAPPY19
  - axis
ABS_X  ABS_Y  ABS_Z  ABS_RZ  ABS_MISC

The xbox controller and the xbox 360 controller are more or less
the same; the 360 controller has a couple of shoulder buttons instead
of the black and white buttons, and (somewhat oddly) the "back"
buttons come in as "select", but that's workable.

It all rather goes pear-shaped when we get beyond that, though.
The PS3 controller, while physically quite similar to the other two,
even down to the placement of controls and how the controls are
clustered, comes in completely differently.  There is not a single
button in common between the PS3 controller and the XBox controllers
as reported by evdev, despite the PS3 controller having buttons
physically labelled "start" and "select", plus direct equivalents to
many of the XBox 360 controller's parts (ie: TL, TR, MODE, ABS_HAT0X,
ABS_HAT0Y, ABS_RX, ABS_RY...).

The PS3 controller also has several "(?)" entries for buttons and
axis values, and also appears to have (if I understand correctly) a
bunch of codes for a multitouch panel?  I couldn't tell you what the
right or left stick axis values are in the above, because though I did
build my kernel with ps3 controller support, and evtest did see it and
dump the supported event list, I get no events logged from... ah, ok,
I have to hit the PS button to get it to actually work.  And now
there's a torrent of what I assume is accelerometer data coming in on
"(?)" events.

It turns out the left stick is ABS_X and ABS_Y, and right stick is
ABS_Z and ABS_RZ.  I suspect this is just broken somehow.  Maybe the
ps3 gamepad kernel driver is still a work in progress?  But this is
the kind of thing I was talking about; the data I get from a ps3
gamepad is mapped totally differently from the data I get from an xbox
gamepad, so from a game point of view, even if all I want is a
joystick, a jump button and a shoot button, I still have to care what
particular kind of gamepad the player has plugged in because I'm going
to get completely different button messages depending on what kind of
pad is plugged in.

> The current behaviour can be checked with evtest:
> http://cgit.freedesktop.org/evtest/
> Was that what you used to check the controller events?

   Previously, I'd been dumping data from the libjsw interface, and
then dumping data by going directly through evdev.  This time I used
evtest.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-23 Thread David Herrmann
Hi Todd

On Tue, Apr 23, 2013 at 4:12 PM, Todd Showalter  wrote:
> On Tue, Apr 23, 2013 at 7:25 AM, Pekka Paalanen  wrote:
>
>> what you describe here is very much a keymap-like database for game
>> controllers: translating from button and axis indices to labels or
>> symbols. However, having a brief chat with Daniel Stone, it seems we
>> should not need these.
>>
>> Take a look at /usr/include/linux/input.h
>>
>> There you find definitions for BTN_A, BTN_X, BTN_START, ABS_X, ABS_Y,
>> ABS_RX, ABS_RY, ABS_HATnn, and many more. The kernel evdev interface
>> should already be giving out events with the correct label, so we would
>> not need any mapping.
>>
>> Are you saying that the kernel gives out the labels wrong? If so, this
>> should be fixed in the kernel drivers. One thing less to care about in
>> Wayland. We "just" need to write the protocol for these devices, the
>> labels should be already there.
>
> I'm not saying the labels are wrong; I assume they are correct.
> The problem is that the labels are hardware-specific, at least for the
> buttons.  That said, it looks like the axis values are being properly
> labelled, which means we're way closer to sane behaviour than we were
> last time I looked into this.
>
> I grabbed evtest and ran it with three devices; an xbox
> controller, an xbox 360 controller, and a ps3 controller.  The
> results:
>
> xbox:
> - button
>A B X Y START THUMBL THUMBR
>Z (white button)
>C (black button)
>SELECT (back button)
> - axis
>ABS_X ABS_Y ABS_Z
>ABS_RX ABS_RY ABS_RZ
>ABS_HAT0X ABS_HAT0Y (dpad)
>
> xbox 360:
> - button
>A B X Y START THUMBL THUMBR
>TL  TR
>MODE (home button)
>SELECT (back button)
> - axis
>ABS_X ABS_Y ABS_Z
>ABS_RX ABS_RY ABS_RZ
>ABS_HAT0X ABS_HAT0Y (dpad)
>
> ps3:
>   - button
> TRIGGER  THUMB  THUMB2
> TOP TOP2 PINKIE BASE  BASE2
> BASE3  BASE3  BASE4  BASE5
> BASE6  DEAD  TRIGGER_HAPPY17
> TRIGGER_HAPPY18  TRIGGER_HAPPY19
>   - axis
> ABS_X  ABS_Y  ABS_Z  ABS_RZ  ABS_MISC
>
> The xbox controller and the xbox 360 controller are more or less
> the same; the 360 controller has a couple of shoulder buttons instead
> of the black and white buttons, and (somewhat oddly) the "back"
> buttons come in as "select", but that's workable.
>
> It all rather goes pear-shaped when we get beyond that, though.
> The PS3 controller, while physically quite similar to the other two,
> even down to the placement of controls and how the controls are
> clustered, comes in completely differently.  There is not a single
> button in common between the PS3 controller and the XBox controllers
> as reported by evdev, despite the PS3 controller having buttons
> physically labelled "start" and "select", plus direct equivalents to
> many of the XBox 360 controller's parts (ie: TL, TR, MODE, ABS_HAT0X,
> ABS_HAT0Y, ABS_RX, ABS_RY...).

That's a known problem that isn't easy to fix. Of course, we can
adjust the kernel driver, but this breaks applications that expect the
current behavior. The problem I see is that we never paid enough
attention to how keys are mapped when adding kernel drivers, and now we
have a mess of different mappings that cannot be fixed. You can try to
send patches to linux-in...@vger.kernel.org, but chances are low that
they will get merged.

Nevertheless, the problem is actually easy to fix in userspace. You
just need to create buttons mappings for the different devices. There
is no complex logic involved. It would be enough to have a bunch of
static tables indexed by input-device names which map the input
keycode to the correct/expected output keycode.
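
A sketch of what such a table could look like in C (the device name
string and the particular mappings here are illustrative only):

    #include <linux/input.h>

    struct pad_mapping {
        const char *device_name;  /* matched against EVIOCGNAME() output */
        struct { unsigned short from, to; } buttons[16];
    };

    static const struct pad_mapping mappings[] = {
        { "Sony PLAYSTATION(R)3 Controller",
          { { BTN_TRIGGER, BTN_A }, { BTN_THUMB, BTN_B } /* ... */ } },
    };

    /* Unfilled entries are zero, so the loop stops at the first blank. */
    static unsigned short remap(const struct pad_mapping *m,
                                unsigned short code)
    {
        for (int i = 0; i < 16 && m->buttons[i].from; i++)
            if (m->buttons[i].from == code)
                return m->buttons[i].to;
        return code;  /* pass through anything unlisted */
    }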

But I cannot see a reason why a compositor should do this, though.
This can be easily put into a library and every game that needs it
performs the mappings. Table-mappings add a single memory-read which
shouldn't affect performance and clients can share the library.
The compositor isn't interested in joystick/gamepad events so my first
approach would be to let clients handle them.

> The PS3 controller also has several "(?)" entries for buttons and
> axis values, and also appears to have (if I understand correctly) a
> bunch of codes for a multitouch panel?  I couldn't tell you what the
> right or left stick axis values are in the above, because though I did
> build my kernel with ps3 controller support, and evtest did see it and
> dump the supported event list, I get no events logged from... ah, ok,
> I have to hit the PS button to get it to actually work.  And now
> there's a torrent of what I assume is accelerometer data coming in on
> "(?)" events.
>
> It turns out the left stick is ABS_X and ABS_Y, and right stick is
> ABS_Z and ABS_RZ.  I suspect this is just broken somehow.  Maybe the
> ps3 gamepad kernel driver is still a work in progress?  But this is
> the kind of thing I was talking about; the data I get from a ps3
> gamepad is mapped totally differently from the data I get from an xbox
> gamepad, so from a game point of view, even if all I want is a
> joystick, a jump button and a shoot button, I still have to care what
> particular kind of gamepad the player has plugged in because I'm going
> to get completely different button messages depending on what kind of
> pad is plugged in.

Re: Input and games.

2013-04-24 Thread Pekka Paalanen
On Wed, 24 Apr 2013 08:26:19 +0200
David Herrmann  wrote:

> Hi Todd
> 
> On Tue, Apr 23, 2013 at 4:12 PM, Todd Showalter  wrote:
> > On Tue, Apr 23, 2013 at 7:25 AM, Pekka Paalanen  wrote:
> >
> >> what you describe here is very much a keymap-like database for game
> >> controllers: translating from button and axis indices to labels or
> >> symbols. However, having a brief chat with Daniel Stone, it seems we
> >> should not need these.
> >>
> >> Take a look at /usr/include/linux/input.h
> >>
> >> There you find definitions for BTN_A, BTN_X, BTN_START, ABS_X, ABS_Y,
> >> ABS_RX, ABS_RY, ABS_HATnn, and many more. The kernel evdev interface
> >> should already be giving out events with the correct label, so we would
> >> not need any mapping.
> >>
> >> Are you saying that the kernel gives out the labels wrong? If so, this
> >> should be fixed in the kernel drivers. One thing less to care about in
> >> Wayland. We "just" need to write the protocol for these devices, the
> >> labels should be already there.
> >
> > I'm not saying the labels are wrong; I assume they are correct.
> > The problem is that the labels are hardware-specific, at least for the
> > buttons.  That said, it looks like the axis values are being properly
> > labelled, which means we're way closer to sane behaviour than we were
> > last time I looked into this.
> >
> > I grabbed evtest and ran it with three devices; an xbox
> > controller, an xbox 360 controller, and a ps3 controller.  The
> > results:
> >
> > xbox:
> > - button
> >A B X Y START THUMBL THUMBR
> >Z (white button)
> >C (black button)
> >SELECT (back button)
> > - axis
> >ABS_X ABS_Y ABS_Z
> >ABS_RX ABS_RY ABS_RZ
> >ABS_HAT0X ABS_HAT0Y (dpad)
> >
> > xbox 360:
> > - button
> >A B X Y START THUMBL THUMBR
> >TL  TR
> >MODE (home button)
> >SELECT (back button)
> > - axis
> >ABS_X ABS_Y ABS_Z
> >ABS_RX ABS_RY ABS_RZ
> >ABS_HAT0X ABS_HAT0Y (dpad)
> >
> > ps3:
> >   - button
> > TRIGGER  THUMB  THUMB2
> > TOP TOP2 PINKIE BASE  BASE2
> > BASE3  BASE3  BASE4  BASE5
> > BASE6  DEAD  TRIGGER_HAPPY17
> > TRIGGER_HAPPY18  TRIGGER_HAPPY19
> >   - axis
> > ABS_X  ABS_Y  ABS_Z  ABS_RZ  ABS_MISC
> >
> > The xbox controller and the xbox 360 controller are more or less
> > the same; the 360 controller has a couple of shoulder buttons instead
> > of the black and white buttons, and (somewhat oddly) the "back"
> > buttons come in as "select", but that's workable.
> >
> > It all rather goes pear-shaped when we get beyond that, though.
> > The PS3 controller, while physically quite similar to the other two,
> > even down to the placement of controls and how the controls are
> > clustered, comes in completely differently.  There is not a single
> > button in common between the PS3 controller and the XBox controllers
> > as reported by evdev, despite the PS3 controller having buttons
> > physically labelled "start" and "select", plus direct equivalents to
> > many of the XBox 360 controller's parts (ie: TL, TR, MODE, ABS_HAT0X,
> > ABS_HAT0Y, ABS_RX, ABS_RY...).
> 
> That's a known problem that isn't easy to fix. Of course, we can
> adjust the kernel driver, but this breaks applications that expect the
> current behavior. The problem I see is that we never paid enough
> attention to how keys are mapped when adding kernel drivers, and now we
> have a mess of different mappings that cannot be fixed. You can try to
> send patches to linux-in...@vger.kernel.org, but chances are low that
> they will get merged.
> 
> Nevertheless, the problem is actually easy to fix in userspace. You
> just need to create buttons mappings for the different devices. There
> is no complex logic involved. It would be enough to have a bunch of
> static tables indexed by input-device names which map the input
> keycode to the correct/expected output keycode.
> 
> But I cannot see a reason why a compositor should do this, though.
> This can be easily put into a library and every game that needs it
> performs the mappings. Table-mappings add a single memory-read which
> shouldn't affect performance and clients can share the library.
> The compositor isn't interested in joystick/gamepad events so my first
> approach would be to let clients handle them.
> 
> > The PS3 controller also has several "(?)" entries for buttons and
> > axis values, and also appears to have (if I understand correctly) a
> > bunch of codes for a multitouch panel?  I couldn't tell you what the
> > right or left stick axis values are in the above, because though I did
> > build my kernel with ps3 controller support, and evtest did see it and
> > dump the supported event list, I get no events logged from... ah, ok,
> > I have to hit the PS button to get it to actually work.  And now
> > there's a torrent of what I assume is accelerometer data coming in on
> > "(?)" events.
> >
> > It turns out the left stick is ABS_X and ABS_Y, and right stick is
> > ABS_Z and ABS_RZ.  I suspect this is just broken somehow.

Re: Input and games.

2013-04-24 Thread Todd Showalter
On Wed, Apr 24, 2013 at 2:26 AM, David Herrmann  wrote:

> That's a known problem that isn't easy to fix. Of course, we can
> adjust the kernel driver, but this breaks applications that expect the
> current behavior. The problem I see is that we never paid enough
> > attention to how keys are mapped when adding kernel drivers, and now we
> have a mess of different mappings that cannot be fixed. You can try to
> send patches to linux-in...@vger.kernel.org, but chances are low that
> they will get merged.

In the particular case of the ps3 controller, I have trouble
believing anything is depending on the current behaviour, at least as
I see it.  I haven't tried to use the data, but the button and axis
mappings just look wrong.  The right stick is coming in as (ABS_Z,
ABS_RZ), which is how the left and right shoulder triggers are mapped
on the xbox controller.  The buttons have names like DEAD and "(?)",
the latter of which looks to be evtest's name lookup failure string.

The more general problem, though, is that it means we have to
treat every gamepad as unique, when they should be as ubiquitous and
generic as mice.

Sitting next to me here is a Kensington Slimblade trackball.  It's
my "mouse".  It has two axis control like a mouse, four buttons,
pretends to have a scroll wheel (rotating the ball around the z axis
clockwise or counterclockwise sends mouse wheel clicks), and generally
works.

The top two buttons on it are mapped totally wrong on the wire
when it talks over USB.  Rather than coming in as buttons 2 and 3, the
stupid thing sets flags in an option field in the USB data packets.
There's a quirk in the kernel driver to remap them to BTN_MIDDLE and
BTN_SIDE, so as far as anything reading the evdev stream is concerned,
it's just a mouse like any other, with four buttons and a wheel.

I can't do that with gamepads.  I should be able to.  The core
idea of a gamepad (two sticks, two analog shoulder triggers, a dpad, a
"home" button, a "start" button, two shoulder buttons, four face
buttons opposite the dpad, and click on the two sticks) is universal
enough at this point that it ought to be considered a basic system
abstraction, like the core idea of a mouse.

> Nevertheless, the problem is actually easy to fix in userspace. You
> just need to create buttons mappings for the different devices. There
> is no complex logic involved. It would be enough to have a bunch of
> static tables indexed by input-device names which map the input
> keycode to the correct/expected output keycode.

"easy to fix in userspace" rather ignores the deployment problem,
and it also ignores the disconnect between the people adding hardware
support and the people mapping it to something useful.

If they can't just be fixed in place, what I'd much rather see is
one of two fixes:

a) an ioctl() or some similar knob I can frob to say "deliver events
in canonical form, please"

b) have gamepads produce *two* /dev/input/eventX entries, one for
"Microsoft X-Box 360 pad" or whatever, and the other for "system
gamepad"; deliver events in the current form over the named device,
and in canonical form over the "system gamepad" device

Of the two solutions, I prefer the latter; it means I can walk
down the list looking for "system gamepad" and only open those, but
know that I'll get data I don't have to massage.  It also means that
as soon as the kernel supports a new bit of hardware, it will Just
Work. We won't have to deal with keeping an external library in sync.
We won't have to try to examine the output of EVIOCGNAME() and try to
figure out if this is one of the devices we support.  We won't fail to
use an XBox 360 controller just because the device string said
"MadCatz" instead of "Microsoft" and we'd never seen one of those in
the wild and hadn't added it to our list of acceptable device name
strings.  Or because we were using vendor IDs and didn't have that
vendor ID on our list.
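
For contrast, the status quo forces roughly this kind of guessing on
clients (EVIOCGNAME is the real ioctl; the name matching is exactly
the fragile part):

    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/input.h>

    /* Returns an open fd for the first device whose name looks like a
     * known gamepad, or -1.  The caller owns the fd. */
    int find_gamepad(void)
    {
        char path[64], name[256];

        for (int i = 0; i < 32; i++) {
            snprintf(path, sizeof(path), "/dev/input/event%d", i);
            int fd = open(path, O_RDONLY);
            if (fd < 0)
                continue;
            if (ioctl(fd, EVIOCGNAME(sizeof(name)), name) >= 0 &&
                strstr(name, "X-Box 360"))  /* the fragile guess */
                return fd;
            close(fd);
        }
        return -1;
    }

With a canonical "system gamepad" node, the strstr() guess would become
an exact match, and new hardware would need no client updates.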

I think Linux actually has a decent shot in the next five years of
becoming the OS for living room game and streaming media PCs (heck,
we've got Valve sort-of onside, Microsoft's sitting unloved in the
win8 pit, and Apple has been heading in the direction of more
iOS/portable, not sit-down computing), but if it's going to happen
then things like gamepads need to Just Work.  If we're at a place
where the FAQ is telling people that in order to support their new
Logitech MustThraster JoyBanger they need to pull version 0.19.7-r3 or
later of libGamepadTrans.so from github, it all goes pear-shaped for
most people fast.

> But I cannot see a reason why a compositor should do this, though.
> This can be easily put into a library and every game that needs it
> performs the mappings. Table-mappings add a single memory-read which
> shouldn't affect performance and clients can share the library.
> The compositor isn't interested in joystick/gamepad events so my first
> approach would be to let clients handle them.

I don't think the 

Re: Input and games.

2013-04-24 Thread Jason Ekstrand
On Wed, Apr 24, 2013 at 9:41 AM, Todd Showalter wrote:

> On Wed, Apr 24, 2013 at 2:26 AM, David Herrmann 
> wrote:
>
> > That's a known problem that isn't easy to fix. Of course, we can
> > adjust the kernel driver, but this breaks applications that expect the
> > current behavior. The problem I see is that we never paid enough
> > attention how keys are mapped when adding kernel drivers and now we
> > have a mess of different mappings that cannot be fixed. You can try to
> > send patches to linux-in...@vger.kernel.org, but chances are low that
> > they will get merged.
>
> In the particular case of the ps3 controller, I have trouble
> believing anything is depending on the current behaviour, at least as
> I see it.  I haven't tried to use the data, but the button and axis
> mappings just look wrong.  The right stick is coming in as (ABS_Z,
> ABS_RZ), which is how the left and right shoulder triggers are mapped
> on the xbox controller.  The buttons have names like DEAD and "(?)",
> the latter of which looks to be evtest's name lookup failure string.
>
> The more general problem, though, is that it means we have to
> treat every gamepad as unique, when they should be as ubiquitous and
> generic as mice.
>
> Sitting next to me here is a Kensington Slimblade trackball.  It's
> my "mouse".  It has two axis control like a mouse, four buttons,
> pretends to have a scroll wheel (rotating the ball around the z axis
> clockwise or counterclockwise sends mouse wheel clicks), and generally
> works.
>
> The top two buttons on it are mapped totally wrong on the wire
> when it talks over USB.  Rather than coming in as buttons 2 and 3, the
> stupid thing sets flags in an option field in the USB data packets.
> There's a quirk in the kernel driver to remap them to BTN_MIDDLE and
> BTN_SIDE, so as far as anything reading the evdev stream is concerned,
> it's just a mouse like any other, with four buttons and a wheel.
>
> I can't do that with gamepads.  I should be able to.  The core
> idea of a gamepad (two sticks, two analog shoulder triggers, a dpad, a
> "home" button, a "start" button, two shoulder buttons, four face
> buttons opposite the dpad, and click on the two sticks) is universal
> enough at this point that it ought to be considered a basic system
> abstraction, like the core idea of a mouse.
>
> > Nevertheless, the problem is actually easy to fix in userspace. You
> > just need to create buttons mappings for the different devices. There
> > is no complex logic involved. It would be enough to have a bunch of
> > static tables indexed by input-device names which map the input
> > keycode to the correct/expected output keycode.
>
> "easy to fix in userspace" rather ignores the deployment problem,
> and it also ignores the disconnect between the people adding hardware
> support and the people mapping it to something useful.
>
> If they can't just be fixed in place, what I'd much rather see is
> one of two fixes:
>
> a) an ioctl() or some similar knob I can frob to say "deliver events
> in canonical form, please"
>
> b) have gamepads produce *two* /dev/input/eventX entries, one for
> "Microsoft X-Box 360 pad" or whatever, and the other for "system
> gamepad"; deliver events in the current form over the named device,
> and in canonical form over the "system gamepad" device
>
> Of the two solutions, I prefer the latter; it means I can walk
> down the list looking for "system gamepad" and only open those, but
> know that I'll get data I don't have to massage.  It also means that
> as soon as the kernel supports a new bit of hardware, it will Just
> Work. We won't have to deal with keeping an external library in sync.
> We won't have to try to examine the output of EVIOCGNAME() and try to
> figure out if this is one of the devices we support.  We won't fail to
> use an XBox 360 controller just because the device string said
> "MadCatz" instead of "Microsoft" and we'd never seen one of those in
> the wild and hadn't added it to our list of acceptable device name
> strings.  Or because we were using vendor IDs and didn't have that
> vendor ID on our list.
>
> I think Linux actually has a decent shot in the next five years of
> becoming the OS for living room game and streaming media PCs (heck,
> we've got Valve sort-of onside, Microsoft's sitting unloved in the
> win8 pit, and Apple has been heading in the direction of more
> iOS/portable, not sit-down computing), but if it's going to happen
> then things like gamepads need to Just Work.  If we're at a place
> where the FAQ is telling people that in order to support their new
> Logitech MustThraster JoyBanger they need to pull version 0.19.7-r3 or
> later of libGamepadTrans.so from github, it all goes pear-shaped for
> most people fast.
>
> > But I cannot see a reason why a compositor should do this, though.
> > This can be easily put into a library and every game that needs it
> > performs the mappings. Table-mappings add a single memory-read which
> > shouldn't affect performance and clients can share the library.

Re: Input and games.

2013-04-24 Thread Todd Showalter
On Wed, Apr 24, 2013 at 11:03 AM, Jason Ekstrand  wrote:

> I realize that my little Android project shouldn't be the sole driver of
> protocol decisions, but I don't think that is the only case where game
> controller events would come from something that's not evdev.  As another
> example, people have talked about Wayland on FreeBSD; how does FreeBSD
> handle game controllers?  Can we assume that some sort of evdev fd passing
> will work there and on Linux in any sort of reasonable way?

I haven't run FreeBSD for a while, but my memory of how game
controllers were handled is that it was not far removed from just
throwing the USB packets at the client and letting the client figure
it out.  Hopefully it has improved.

The core of my argument here is that there should be a standard
gamepad coming through the event system, much like the standard mouse
does.  The standard gamepad would be:

- left analog stick
- right analog stick
- left analog trigger
- right analog trigger
- dpad
- home button (ps3 ps, xbox glowy x, wii home)
- start button
- left shoulder button
- right shoulder button
- face top button (ps3 triangle, xbox Y)
- face left button (ps3 square, xbox X)
- face right button (ps3 circle, xbox B)
- face bottom button (ps3 x, xbox A)

An actual gamepad could generate more events than this (xbox has a
back button, ps3 has a select button, ps3 also has accelerometers...),
and some stripped down gamepads might not be able to produce all
events (no analog triggers, perhaps, or no home button), but what the
gamepad has that matches the spec should produce the standard events.

As a game I want to be able to say: "oh, a gamepad, ABS_X and
ABS_Y are the stick, BTN_FACE_SOUTH is jump, BTN_FACE_EAST is shoot,
and BTN_START brings up the pause menu".  I don't want to have to
dlopen("libGameTrans.so", RTLD_NOW), then call a function to march the
event list looking for supported gamepads, then call functions to hook
up translation layers and so forth.  We don't make clients do this for
mice, they shouldn't have to for gamepads.
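
Spelled out as code, the goal is for a game to be able to write
something like this (BTN_FACE_SOUTH and BTN_FACE_EAST are hypothetical
canonical codes, not in linux/input.h today, and the player_* handlers
stand in for the game's own functions):

    void handle_pad_event(const struct input_event *ev)
    {
        if (ev->type == EV_ABS) {
            if (ev->code == ABS_X) player_move_x(ev->value);
            if (ev->code == ABS_Y) player_move_y(ev->value);
        } else if (ev->type == EV_KEY && ev->value) {
            if (ev->code == BTN_FACE_SOUTH) player_jump();   /* hypothetical */
            if (ev->code == BTN_FACE_EAST)  player_shoot();  /* hypothetical */
            if (ev->code == BTN_START)      show_pause_menu();
        }
    }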

We're into the weeds on evdev a bit because that's what happens to
be producing these events on Linux, but my ultimate concern is what a
client has to do in order to use a gamepad in the Wayland world.  I
would like that process to be as sane and trouble-free as possible,
regardless of what Wayland is sitting on top of.  So, I would prefer
things to Just Work on your project as well.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-24 Thread Bill Spitzak

On 04/23/2013 11:26 PM, David Herrmann wrote:


> I'm currently looking into an interface that provides file-descriptors
> for wl_keyboard/wl_mouse for clients. The FDs are muted (EVIOCMUTE
> proposed on linux-input by krh) while clients are inactive and unmuted
> when they get input focus. This is basically a performance boost
> because input events no longer pass through the compositor.
> However, this mechanism could be easily used to forward any other
> input fd to clients. A wl_gamepad interface could be just empty except
> for this FD-passing logic.


Anything like this (and also a database of device key mappings) is going
to have trouble if Wayland supports remote clients. All of this would
have to be forwarded to the client from the remote display, introducing
a lot of complexity and nasty bugs when clients disagree with the server
about what device mapping is being used. It also looks impossible to
support APIs like RDP where this translation is already done, except by
really kludgy inverse keymaps (which cause stupid bugs in NX right now,
so I think they are a bad idea).


Also, even the current Wayland behavior with keyboards seems to conflict
with input methods. You communicate with input methods via Wayland
requests and events, and the input method has to do all the work of
decoding the keystrokes anyway.


___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-24 Thread Rick Yorgason
Todd Showalter  writes:

> The core of my argument here is that there should be a standard
> gamepad coming through the event system, much like the standard mouse
> does.  The standard gamepad would be: 

For reference, in the Windows XP days joystick input was done with
DirectInput, which was designed to be as flexible as possible, work with any
input device, and even went as far as to let you query human-readable names
for devices and buttons.

Now they've deprecated DirectInput for the much simpler XInput, which lays
out the controller like so:

http://msdn.microsoft.com/en-ca/library/windows/desktop/microsoft.directx_sdk.reference.xinput_gamepad%28v=vs.85%29.aspx
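
For reference, the struct behind that link is (WORD/BYTE/SHORT being
the usual Windows typedefs):

    typedef struct _XINPUT_GAMEPAD {
        WORD  wButtons;       /* dpad, face, shoulder, thumb, start/back */
        BYTE  bLeftTrigger;   /* analog triggers, 0..255 */
        BYTE  bRightTrigger;
        SHORT sThumbLX;       /* analog sticks, -32768..32767 */
        SHORT sThumbLY;
        SHORT sThumbRX;
        SHORT sThumbRY;
    } XINPUT_GAMEPAD;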

That's a nice set of buttons/axes to use as a standard abstraction, although
it would be nice if they had built that on top of DirectInput's flexibility.
Having a sane default configuration is great, but in XInput it comes at the
cost of not allowing players to customize their controls to support more
exotic hardware. It would be amazing if Wayland/evdev was designed around
this middle-ground.

One thing I would expect a joystick abstraction to do that I don't expect a
mouse abstraction to do is, if I plug two mice into a system I expect them
both to control the same cursor, but with joysticks I always want to know
which joystick is sending each message.

(By the way, I like Todd's north/east/south/west abstraction for the face
buttons. It's probably also safe to abstract start/back into
startnext/backselect. XInput notably does not allow access to the home
button, and even on Linux it would probably be bad form for games to use the
home button, but a low-level protocol would need to define it so it could be
used for things like Steam's Big Picture or media centre compositors.)

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-24 Thread Todd Showalter
On Wed, Apr 24, 2013 at 5:03 PM, Rick Yorgason  wrote:

>> The core of my argument here is that there should be a standard
>> gamepad coming through the event system, much like the standard mouse
>> does.  The standard gamepad would be: 
>
> For reference, in the Windows XP days joystick input was done with
> DirectInput, which was designed to be as flexible as possible, work with any
> input device, and even went as far as to let you query human-readable names
> for devices and buttons.
>
> Now they've deprecated DirectInput for the much simpler XInput, which lays
> out the controller like so:
>
> http://msdn.microsoft.com/en-ca/library/windows/desktop/microsoft.directx_sdk.reference.xinput_gamepad%28v=vs.85%29.aspx

They can get away with that now because things are so much
more homogeneous; when DirectInput was new, PC game
controllers were all over the place in terms of functionality;
gamepads tended to be modelled on the SNES controller (with no analog
controls) and joysticks were becoming grotesquely baroque collections
of axis values, force feedback systems and buttons as the flight
simulator market gradually specialized itself to death.

These days everyone has pretty much settled down to the same
formula, because it works reasonably and is generally familiar.  There
are things the basic gamepad doesn't cover (racing wheels, proper
flight sim controls, motion devices...), and maybe some day it would
be nice to have standards for those too, but I think right now the
time is more than ripe for gamepads.

> That's a nice set of buttons/axes to use as a standard abstraction, although
> it would be nice if they had built that on top of DirectInput's flexibility.
> Having a sane default configuration is great, but in XInput it comes at the
> cost of not allowing players to customize their controls to support more
> exotic hardware. It would be amazing if Wayland/evdev was designed around
> this middle-ground.

In its current state with evdev, it appears what we get is a
stream of axis and button events, but the index of those events is
arbitrary (ie: the ps3 right stick doesn't use the same axes as the
xbox 360 controller, and the buttons have no overlap at all), and
there's no way to query what the actual axis or button values are
without a priori knowledge.  You need to know that if the device
string is "Microsoft X-Box 360 pad" you need to map the incoming
events according to the template for that device.

I don't mind that for the functionality beyond the basic gamepad
abstraction, but without the basic gamepad abstraction there it makes
something that should be simple a hassle both for the developer and
the end user.  If we can get the core gamepad into evdev and pass the
nonstandard events through at higher index values, I think we get
everything we want out of it, at least on Linux.

> One thing I would expect a joystick abstraction to do that I don't expect a
> mouse abstraction to do is, if I plug two mice into a system I expect them
> both to control the same cursor, but with joysticks I always want to know
> which joystick is sending each message.

Yes, definitely.  Which also leads to the whole device naming
question; ie: if someone unplugs a controller and plugs it back in,
how do you make sure player 2 stays player 2?  But I think as long as
the rules are simple (if one controller disappears and reappears,
assume it's the same player, if multiple controllers disappear and
reappear at the same time, well, pilot error, user gets whatever they
get), it's largely a problem that can be papered over.

> (By the way, I like Todd's north/east/south/west abstraction for the face
> buttons. It's probably also safe to abstract start/back into
> startnext/backselect. XInput notably does not allow access to the home
> button, and even on Linux it would probably be bad form for games to use the
> home button, but a low-level protocol would need to define it so it could be
> used for things like Steam's Big Picture or media centre compositors.)

Our game engine has been running on all sorts of stuff over the
years, so we've seen controls called all sorts of things.  The best
part is the XBox controllers vs. the Nintendo controllers; Nintendo is
(clockwise from the top) xaby, while XBox is ybax.  So, the x and y
buttons are in the swapped and the a and b buttons are swapped.  As a
result, I tend to prefer something more abstract.

Actually, IIRC for a while in the original PlayStation libraries
you had the option of referring to the face buttons as if they were a
second dpad; there was a parallel set of #defines for the button masks
that were RPAD_UP, RPAD_RIGHT and so forth.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-25 Thread Henri Tuhola
On Thu, Apr 25, 2013 at 1:29 AM, Todd Showalter wrote:

> On Wed, Apr 24, 2013 at 5:03 PM, Rick Yorgason  wrote:
>
> > One thing I would expect a joystick abstraction to do that I don't
> expect a
> > mouse abstraction to do is, if I plug two mice into a system I expect
> them
> > both to control the same cursor, but with joysticks I always want to know
> > which joystick is sending each message.
>
> Yes, definitely.  Which also leads to the whole device naming
> question; ie: if someone unplugs a controller and plugs it back in,
> how do you make sure player 2 stays player 2?  But I think as long as
> the rules are simple (if one controller disappears and reappears,
> assume it's the same player, if multiple controllers disappear and
> reappear at the same time, well, pilot error, user gets whatever they
> get), it's largely a problem that can be papered over.


There don't seem to be unique identifiers in the devices, though. It is
possible to identify a controller by using the USB port it was connected
to, and the device class. You can use the same technique to identify a
USB hub.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-25 Thread Todd Showalter
On 2013-04-25, at 5:38 AM, Henri Tuhola  wrote:

> On Thu, Apr 25, 2013 at 1:29 AM, Todd Showalter  wrote:
>> 
>> Yes, definitely.  Which also leads to the whole device naming
>> question; ie: if someone unplugs a controller and plugs it back in,
>> how do you make sure player 2 stays player 2?  But I think as long as
>> the rules are simple (if one controller disappears and reappears,
>> assume it's the same player, if multiple controllers disappear and
>> reappear at the same time, well, pilot error, user gets whatever they
>> get), it's largely a problem that can be papered over.
> 
> Although there doesn't seem to be unique identifiers in the devices. It is 
> possible to identify a controller by using the USB port it was connected 
> into, and the device class. You can use the same technique to identify an USB 
> hub.

True, and you can use that and other hints (vendor & device IDs or strings and
so forth), but you can't count on people plugging the pad back into the same 
port (my living room PC has something like 14 USB ports, and they are around 
the back where I can't see them).

Once you get into Bluetooth controllers where there is no physical wire...

  Todd.

--
  Todd Showalter, President
  Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-25 Thread Pekka Paalanen
On Wed, 24 Apr 2013 10:03:35 -0500
Jason Ekstrand  wrote:

> On Wed, Apr 24, 2013 at 9:41 AM, Todd Showalter
> wrote:
> 
> > On Wed, Apr 24, 2013 at 2:26 AM, David Herrmann
> >  wrote:
> >
> > > I'm currently looking into an interface that provides
> > > file-descriptors for wl_keyboard/wl_mouse for clients. The FDs
> > > are muted (EVIOCMUTE proposed on linux-input by krh) while
> > > clients are inactive and unmuted when they get input focus. This
> > > is basically a performance boost because input events no longer
> > > pass through the compositor. However, this mechanism could be
> > > easily used to forward any other input fd to clients. A
> > > wl_gamepad interface could be just empty except for this
> > > FD-passing logic.
> >
> > Being able to talk to the gamepads through evdev without
> > elevated permissions would definitely be a step forward.
> >
> 
> Hi all,
> I hope you'll allow me to chip in just a bit here.  As a disclaimer, I
> haven't read the entire thread but I think I have a decent idea of
> what's gone on.
> 
> Personally, I would like to see some sort of a wl_joypad interface
> that isn't just an FD passthrough.  Sure, it will probably be
> optional and clients shouldn't assume it, but I think it would still
> be good to have it there.  Part of the reason is that, as of now, the
> wayland protocol isn't terribly restricted to standard Linux and I
> would like to avoid things that require Linux underneath, such as an FD
> that's expected to be a Linux evdev. There are a couple of reasons
> for this.
> 
> One reason is if we want to do network forwarding.  Now, instead of
> simply providing a forwarded version of the wl_joypad we have to
> forward the evdev and possibly do some translation of the stream in
> order to make everything happy (I'm not sure what all this would
> involve).  If the client expects to be able to do any ioctls on said
> stream things would get particularly interesting.

If it is evdev we deal with, you also have uinput. I've done evdev over
netcat myself, though getting it right when one device was a Linux
laptop, the other an Android phone, and in between not only netcat but
also some network relay tool from the Android SDK, it was quite a
stretch. Might have worked with fewer hiccups if I had used TCP sockets
directly. But it worked enough, and Weston was happy on the receiving
end.

> Second is that events may not come directly from the kernel.  Once my
> Android compositor app gets beyond just displaying the simple-shm
> client, I'm going to want to forward input events including mouse,
> touch, virtual keyboard, USB keyboard, and gamepad events.  The
> problem is that NONE of those events come from evdev.  They all come
> from the Android event system and need to be translated into
> something useful.

Uinput solves that if we really want evdev fds. Maybe bluetooth device
representation as an evdev device works via uinput in BlueZ, too?

> I realize that my little Android project shouldn't be the sole driver
> of protocol decisions, but I don't think that is the only case where
> game controller events would come from something that's not evdev.
> As another example, people have talked about Wayland on FreeBSD; how
> does FreeBSD handle game controllers?  Can we assume that some sort
> of evdev fd passing will work there and on Linux in any sort of
> reasonable way?

I think this is an important point, and I'm starting to see Todd's
point of a standard protocol interface, too, assuming the kernel drivers
really are unfixable.

The ps3 controller evdev output sounds like no-one ever did any mapping
to it, just threw raw hardware indices out. Maybe I should try and
check that some day... or could it be that the driver was written for
an older controller model, and a newer model is simply missing the
quirks from the kernel driver?

Now, I believe I understand the value of a standard interface. Evdev
file descriptor passing however has a significant benefit, which is
probably the whole reason it is being looked into. It leaves the
compositor out of the input path. This also means that the application
is not limited by the compositor's input forwarding rate. This makes
the input latency issues practically disappear, in addition to power
savings.

Looks like we need both: a standard interface with perhaps limited
capabilities, and the direct evdev fd passing interface for full
performance and features for those who want it.

Todd has already listed what features a standard gamepad or controller
has. Now someone should start designing the protocol. :-)


Thanks,
pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-26 Thread Todd Showalter
On Thu, Apr 25, 2013 at 8:50 AM, Pekka Paalanen  wrote:

> Todd has already listed what features a standard gamepad or controller
> has. Now someone should start designing the protocol. :-)

Based on the wl_pointer docs, I should think it would look
something like this:

--8<--

wl_gamepad::connect -- gamepad appears
The connect event occurs when a gamepad is plugged in, attaches
via wireless, or otherwise becomes available for use.  The message can
also be generated in response to a client "enumerate gamepads" query.

The name argument may or may not be empty, depending on what
information the server has available, but if present can be used by
the client to determine what to do with wl_gamepad::extended events.

The pad_index is used to tell messages from different gamepads
apart; a system may have multiple gamepads connected, and messages
from them need to be distinguishable.  The pad_index value may be much
higher than the number of gamepads currently connected if the user has
been connecting and disconnecting gamepads in pathological ways.

The cookie may or may not contain useful data, and shouldn't be
counted on, but it is a hint to the application about whether
a connecting gamepad has been seen before.  If the hardware has a
software-visible serial number, the cookie should be a hash of that
value.  If the hardware is plugging in at a specific USB port, the
cookie should be a hash of the device path.  The cookie exists so that
if the application sees a gamepad disappear and then another gamepad
appears, if the cookie for the old and new controllers match it can
assume it has the same physical gamepad.

Arguments:
time -- uint -- standard event timestamp
name -- string -- device name
pad_index -- uint -- which gamepad this is
cookie -- uint -- unique device hash; UNRELIABLE, hint only
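
One possible way to derive such a cookie, sketched here with FNV-1a;
the hash choice and the identity strings are assumptions, not part of
the proposal:

#include <stdint.h>

/* Hash a stable identity string: a software-visible serial number if
 * the device has one, else the device path it is plugged in at. */
static uint32_t cookie_hash(const char *identity)
{
    uint32_t h = 2166136261u;   /* FNV-1a offset basis */

    for (; *identity; identity++) {
        h ^= (uint8_t)*identity;
        h *= 16777619u;         /* FNV-1a prime */
    }
    return h;
}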


wl_gamepad::disconnect -- gamepad disappears
The disconnect event occurs when a gamepad becomes unavailable,
either due to unplugging or signal loss.

Arguments:
time -- uint -- standard event timestamp
pad_index -- uint -- which gamepad this is


wl_gamepad::stick -- gamepad stick movement
A stick event occurs when there is stick movement on a gamepad.
It would be preferable if the protocol could handle float data, but
failing that the axis values can be mapped to a large integer range,
as below.  The precision of the fixed type is sufficient for most
current hardware, but not all; for example, the ps3 controller analog
sticks have a [-128 .. 127] range, but the ps3 dpad is also
pressure sensitive, and therefore has an effective range of
[-255 .. 255].  It's not hard to imagine higher-precision controllers
in the future as prices come down.

The stick_index indicates which stick the message pertains to; for
hardware with more than the standard number of joysticks/thumbsticks,
higher index values are possible, but 0 is always left stick, 1 is
always right stick and 2 is always the dpad.  Even if the physical
hardware lacks one or more of those axis values, additional axis
values will be mapped above 2.

Arguments:
time -- uint -- standard event timestamp
pad_index -- uint -- which gamepad this is
stick_index -- uint -- 0 for left stick, 1 for right stick, 2 for dpad
x -- int -- the x axis of the stick mapped to [-2^15 .. 2^15 - 1]
y -- int -- the y axis of the stick mapped to [-2^15 .. 2^15 - 1]


wl_gamepad::trigger -- gamepad analog trigger movement
A trigger event occurs when there is analog trigger movement on a
gamepad.  As with stick messages, it would be preferable if the axis
value could be sent as float, but failing that the value can be mapped
to a large integer range.

The trigger_index is 0 for left trigger values and 1 for right
trigger values.  Hardware with more triggers can potentially supply higher
values; the pressure-sensitive buttons on the ps3 controller would go
here, for instance.

Arguments:
time -- uint -- standard event timestamp
pad_index -- uint -- which gamepad this is
trigger_index -- uint -- 0 for left trigger, 1 for right trigger
x -- uint -- the trigger value mapped to [0 .. 2^15 - 1]


wl_gamepad::button -- gamepad button press
A button event occurs when a button is pressed or released.  The
standard buttons are:

0 - BUTTON_FACE_NORTH -- triangle on ps3, y on xbox
1 - BUTTON_FACE_EAST -- circle on ps3, b on xbox
2 - BUTTON_FACE_SOUTH -- x on ps3, a on xbox
3 - BUTTON_FACE_WEST -- square on ps3, x on xbox
4 - BUTTON_SHOULDER_LEFT -- L1 on ps3, LB on xbox
5 - BUTTON_SHOULDER_RIGHT -- R1 on ps3, RB on xbox
6 - BUTTON_LEFT_STICK -- left stick click
7 - BUTTON_RIGHT_STICK -- right stick click
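
For a sense of what this proposal would look like from the client
side, a purely hypothetical sketch of the generated C binding,
modelled on the wl_pointer listener pattern; none of these symbols
exist in any released library:

#include <stdint.h>

struct wl_gamepad;  /* hypothetical protocol object */

struct wl_gamepad_listener {
    void (*connect)(void *data, struct wl_gamepad *pad, uint32_t time,
                    const char *name, uint32_t pad_index, uint32_t cookie);
    void (*disconnect)(void *data, struct wl_gamepad *pad, uint32_t time,
                       uint32_t pad_index);
    void (*stick)(void *data, struct wl_gamepad *pad, uint32_t time,
                  uint32_t pad_index, uint32_t stick_index,
                  int32_t x, int32_t y);
    void (*trigger)(void *data, struct wl_gamepad *pad, uint32_t time,
                    uint32_t pad_index, uint32_t trigger_index, uint32_t x);
    void (*button)(void *data, struct wl_gamepad *pad, uint32_t time,
                   uint32_t pad_index, uint32_t button, uint32_t state);
};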

Re: Input and games.

2013-04-26 Thread Jason Ekstrand
Hi Todd,
Thanks for putting this together.  I have a few general comments here and
more below.

My first general comment is about floating point.  I'm not 100% sure what
all went into the design decision to make wl_fixed have 8 bits of
fractional precision vs. 12 or 16.  I'm guessing that they wanted the
increased integer capability, but you'd have to ask Kristian about that.
My understanding is that most game controllers work with ranges of [0,1] or
[-1,1] which would be wasteful to put into wl_fixed.  Looking below, it
seems as if you're fairly consistently picking a 16 bit fractional part.
That breaks out of the norm of the wire format a bit, but I think it's
justified in this case.  The big thing is to be consistent which it looks
like you're doing anyway.

Another concern is how to map [0, 255] onto [0, 2^15 - 1] cleanly.
Unfortunately, there is no good way to do this so that 0 -> 0 and 255 ->
2^15 - 1.  Perhaps that doesn't matter much for games since you're sensing
human movements which will be slightly different for each controller anyway.


On Fri, Apr 26, 2013 at 1:28 PM, Todd Showalter wrote:

> On Thu, Apr 25, 2013 at 8:50 AM, Pekka Paalanen 
> wrote:
>
> > Todd has already listed what features a standard gamepad or controller
> > has. Now someone should start designing the protocol. :-)
>
> Based on the wl_pointer docs, I should think it would look
> something like this:
>
> --8<--
>
> wl_gamepad::connect -- gamepad appears
> The connect event occurs when a gamepad is plugged in, attaches
> via wireless, or otherwise becomes available for use.  The message can
> also be generated in response to a client "enumerate gamepads" query.
>
> The name argument may or may not be empty, depending on what
> information the server has available, but if present can be used by
> the client to determine what to do with wl_gamepad::extended events.
>
> The pad_index is used to tell messages from different gamepads
> apart; a system may have multiple gamepads connected, and messages
> from them need to be distinguishable.  The pad_index value may be much
> higher than the number of gamepads currently connected if the user has
> been connecting and disconnecting gamepads in pathological ways.
>
> The cookie may or may not contain useful data, and shouldn't be
> counted on, but it is a hint to the application using the data whether
> a connecting gamepad has been seen before.  If the hardware has a
> software-visible serial number, the cookie should be a hash of that
> value.  If the hardware is plugging in at a specific USB port, the
> cookie should be a hash of the device path.  The cookie exists so that
> if the application sees a gamepad disappear and then another gamepad
> appears, if the cookie for the old and new controllers match it can
> assume it has the same physical gamepad.
>
> Arguments:
> time -- uint -- standard event timestamp
> name -- string -- device name
> pad_index -- uint -- which gamepad this is
> cookie -- uint -- unique device hash; UNRELIABLE, hint only
>

Do we really need connect and disconnect timestamped?  Are those
timestamps going to be reliable/useful?  When you plug in a device, it
takes a second or two just to detect and show up in /dev.  On that time
scale, "when did I see the event?" is just as accurate as any timestamp.


> 
> wl_gamepad::disconnect -- gamepad disappears
> The disconnect event occurs when a gamepad becomes unavailable,
> either due to unplugging or signal loss.
>
> Arguments:
> time -- uint -- standard event timestamp
> pad_index -- uint -- which gamepad this is
>
> 
> wl_gamepad::stick -- gamepad stick movement
> A stick event occurs when there is stick movement on a gamepad.
> It would be preferable if the protocol could handle float data, but
> failing that the axis values can be mapped to a large integer range,
> as below.  The precision of the fixed type is sufficient for most
> current hardware, but not all; for example, the ps3 controller analog
> sticks have a [-128 .. 127] range, but the ps3 dpad is actually
> pressure sensitive, and therefore actually has an effective range of
> [-255 .. 255].  It's not hard to imagine higher-precision controllers
> in the future as prices come down.
>
> The stick_index indicates which stick the message pertains to; for
> hardware with more than the standard number of joysticks/thumbsticks,
> higher index values are possible, but 0 is always left stick, 1 is
> always right stick and 2 is always the dpad.  Even if the physical
> hardware lacks one or more of those axis values, additional axis
> values will be mapped above 2.
>
> Arguments:
> time -- uint -- standard event timestamp
> pad_index -- uint -- which gamepad this is
> stick_index -- uint -- 0 for left stick, 1 for right stick, 2 for dpad

Re: Input and games.

2013-04-26 Thread Jason Ekstrand
Todd,
I think you forgot reply-all.  I add wayland-devel again.

On Fri, Apr 26, 2013 at 5:50 PM, Todd Showalter wrote:

> On Fri, Apr 26, 2013 at 5:46 PM, Jason Ekstrand 
> wrote:
>
> > My first general comment is about floating point.  I'm not 100% sure
> > what all went into the design decision to make wl_fixed have 8 bits
> > of fractional precision vs. 12 or 16.  I'm guessing that they wanted
> > the increased integer capability, but you'd have to ask Kristian
> > about that.  My understanding is that most game controllers work
> > with ranges of [0,1] or [-1,1] which would be wasteful to put into
> > wl_fixed.  Looking below, it seems as if you're fairly consistently
> > picking a 16 bit fractional part.  That breaks out of the norm of
> > the wire format a bit, but I think it's justified in this case.  The
> > big thing is to be consistent which it looks like you're doing
> > anyway.
>
> In my experience, most game controllers actually return byte
> values which you wind up interpreting either as signed or unsigned
> depending on what makes sense.  Certainly that's been the case
> historically.  In games we typically do something like:
>
> stick.x = ((float)raw_x) / ((raw_x >= 0) ? 127.0f : 128.0f);
> stick.y = ((float)raw_y) / ((raw_y >= 0) ? 127.0f : 128.0f);
>
> > Another concern is how to map [0, 255] onto [0, 2^15 - 1] cleanly.
> > Unfortunately, there is no good way to do this so that 0 -> 0 and
> > 255 -> 2^15 - 1.  Perhaps that doesn't matter much for games since
> > you're sensing human movements which will be slightly different for
> > each controller anyway.
>
> There is, actually:
>
> expanded = (base << 7) | (base >> 1);
>
> ie: repeat the bit pattern down into the lower bits.  Examples:
>
> 11111111 -> (111111110000000) | (1111111) -> 111111111111111
> 00000000 -> (000000000000000) | (0000000) -> 000000000000000
> 10000000 -> (100000000000000) | (1000000) -> 100000001000000
> 10110010 -> (101100100000000) | (1011001) -> 101100101011001
>
> And so forth.  It's the same scheme you use when doing color
> channel expansion.  I haven't seen a rigorous mathematical proof that
> it's correct, but I'd be surprised if someone more so inclined than I
> hasn't come up with one.
>

Wow, I've never seen that one before.  And yes, it is provably exactly
correct (up to a little integer round-off because of the implicit right
shift by 1).  I guess I learned a new trick today; that's really cool!
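
For the record, a small C check of the expansion being discussed,
verifying that the endpoints land exactly (a sketch, not from the
original mails):

#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Repeat the 8-bit pattern into the low bits of a 15-bit value. */
static uint16_t expand8to15(uint8_t base)
{
    return (uint16_t)(((uint16_t)base << 7) | (base >> 1));
}

int main(void)
{
    assert(expand8to15(0) == 0);
    assert(expand8to15(255) == 32767);        /* 2^15 - 1 */
    printf("128 -> %u\n", expand8to15(128));  /* prints 16448 */
    return 0;
}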


> [wl_gamepad::connect and disconnect]
>
> > Do we really need connect and disconnect timestamped?  Are those
> > timestamps going to be reliable/useful?  When you plug in a device,
> > it takes a second or two just to detect and show up in /dev.  On
> > that time scale, "when did I see the event?" is just as accurate as
> > any timestamp.
>
> It seemed like all events had timestamps as part of the protocol,
> so I didn't know how fundamental that was to the underlying system.
> The only reason connect and disconnect might need to be timestamped is
> if events are going to be batched up, you might possibly have ordering
> issues with delivery.  If that's not a problem and the underlying
> system doesn't require timestamps, they can go.
>
> [wl_gamepad::button]
>
> >> The trigger_index is 0 for left stick values and 1 for right stick
> >> values.  Hardware with more triggers can potentially supply higher
> >> values; the pressure-sensitive buttons on the ps3 controller would go
> >> here, for instance.
> >
> > Could you be more clear about what other pressure-sensitive buttons
> > on the PS3 controller you're referring to here?  I know they went a
> > bit overboard on pressure sensitivity in the PS3 controller and seem
> > to recall that even buttons like triangle etc. were
> > pressure-sensitive.  That said, those buttons should map as buttons
> > not triggers so that they can be picked up in a canonical way.  Are
> > you simply planning to double-report events there?
>
> I included this as a "this data could work without breaking the
> protocol", but it's not essential.
>
> In the particular case of the ps3 (and all of the dual shock
> controllers, IIRC), all of the buttons are pressure sensitive with a
> [0..255] range except "start", "select", "home" (on pads that have it)
> and the stick clicks.  The face buttons, the dpad and all four
> shoulder buttons are pressure sensitive.  Whether it's worth exporting
> that is another question entirely; I've heard rumour that the ps4
> controller removes pressure sensing from a lot of the buttons.
>
> [wl_gamepad::extended]
>
> > My feeling on this would be to wait until we have a use-case for
> > it.  We can always bump the version and add an event if it comes up.
> > I think that's better than just assuming we can do something
> > sensible with four generic parameters.
>
> This is partly in response to things like the razer Wiimote-like
> contraption that apparently spits out piles of quaternions, and also
> things like hardcore flightsticks that have things like fixed-range
> throttles.  I'm not convinced it's needed either, but I figured if I
> was making a proposed protocol it was worth throwing it in for the
> sake of discussion.

Re: Input and games.

2013-04-26 Thread Todd Showalter
On Fri, Apr 26, 2013 at 8:40 PM, Jason Ekstrand  wrote:

> I think you forgot reply-all.  I add wayland-devel again.

Blast.  Sorry about that.  Thanks!

>> There is, actually:
>>
>> expanded = (base << 7) | (base >> 1);
>>
>> ie: repeat the bit pattern down into the lower bits.  Examples:
>>
>> 11111111 -> (111111110000000) | (1111111) -> 111111111111111
>> 00000000 -> (000000000000000) | (0000000) -> 000000000000000
>> 10000000 -> (100000000000000) | (1000000) -> 100000001000000
>> 10110010 -> (101100100000000) | (1011001) -> 101100101011001
>>
>> And so forth.  It's the same scheme you use when doing color
>> channel expansion.  I haven't seen a rigorous mathematical proof that
>> it's correct, but I'd be surprised if someone more so inclined than I
>> hasn't come up with one.
>
> Wow, I've never seen that one before.  And yes, it is provably exactly
> correct (up to a little integer round-off because of the implicit right
> shift by 1).  I guess I learned a new trick today; that's really cool!

AFAIK folks in graphics hardware have been using that trick at the
hardware level to do color channel expansion (ie: turning RGB565 into
RGB888 or the like) since at least the 90s, but like a lot of the more
clever bit manipulation tricks it's not that widely disseminated.  I
actually came up with it independently back in the 90s and was pretty
proud of myself before a co-worker I was explaining it to shot me down
with "oh, *that*, yeah, that's what my raytracer does." :)

I meant to mention in my original reply that although most
physical hardware (especially historical hardware) reports linearly
mapped signed or unsigned byte axis values, I think a protocol that's
going to be relatively future proof needs to handle higher precision
and convert well to float.  Most games are going to want the sticks to
map cleanly to either digital up/down/left/right buttons or to [-1.0
.. 1.0] ranges, and both of those are easy translations from the [-32k
.. 32k] range.
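
Those two translations might look like this in C; the ranges follow
the proposal above, and the deadzone threshold is an arbitrary
assumption to be tuned per game:

/* Map a [-32768, 32767] axis to float [-1.0, 1.0]. */
static float stick_to_float(int v)
{
    return (float)v / ((v >= 0) ? 32767.0f : 32768.0f);
}

/* Map the same axis to a digital -1/0/+1 direction. */
static int stick_to_digital(int v)
{
    const int deadzone = 8192;  /* ~25% of full deflection */

    if (v > deadzone)
        return 1;
    if (v < -deadzone)
        return -1;
    return 0;
}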

  Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Fwd: Input and games.

2013-04-26 Thread Todd Showalter
I failed to reply-all before, so I'm forwarding this back to the list.

On Fri, Apr 26, 2013 at 5:46 PM, Jason Ekstrand  wrote:

> My first general comment is about floating point.  I'm not 100% sure what
> all went into the design decision to make wl_fixed have 8 bits of fractional
> precision vs. 12 or 16.  I'm guessing that they wanted the increased integer
> capability, but you'd have to ask Kristian about that.  My understanding is
> that most game controllers work with ranges of [0,1] or [-1,1] which would
> be wasteful to put into wl_fixed.  Looking below, it seems as if you're
> fairly consistently picking a 16 bit fractional part.  That breaks out of
> the norm of the wire format a bit, but I think it's justified in this case.
> The big thing is to be consistent which it looks like you're doing anyway.

In my experience, most game controllers actually return byte
values which you wind up interpreting either as signed or unsigned
depending on what makes sense.  Certainly that's been the case
historically.  In games we typically do something like:

stick.x = ((float)raw_x) / ((raw_x >= 0) ? 127.0f : 128.0f);
stick.y = ((float)raw_y) / ((raw_y >= 0) ? 127.0f : 128.0f);

> Another concern is how to map [0, 255] onto [0, 2^15 - 1] cleanly.
> Unfortunately, there is no good way to do this so that 0 -> 0 and 255 ->
> 2^15 - 1.  Perhaps that doesn't matter much for games since you're sensing
> human movements which will be slightly different for each controller anyway.

There is, actually:

expanded = (base << 7) | (base >> 1);

ie: repeat the bit pattern down into the lower bits.  Examples:

11111111 -> (111111110000000) | (1111111) -> 111111111111111
00000000 -> (000000000000000) | (0000000) -> 000000000000000
10000000 -> (100000000000000) | (1000000) -> 100000001000000
10110010 -> (101100100000000) | (1011001) -> 101100101011001

And so forth.  It's the same scheme you use when doing color
channel expansion.  I haven't seen a rigorous mathematical proof that
it's correct, but I'd be surprised if someone more so inclined than I
hasn't come up with one.

[wl_gamepad::connect and disconnect]

> Do we really need connect and disconnect timestamped? Are those timestamps
> going to be reliable/useful?  When you plug in a device, it takes a second
> or two just to detect and show up in /dev.  On that time scale, "when did I
> see the event?" is just as accurate as any timestamp.

It seemed like all events had timestamps as part of the protocol,
so I didn't know how fundamental that was to the underlying system.
The only reason connect and disconnect might need to be timestamped is
if events are going to be batched up, you might possibly have ordering
issues with delivery.  If that's not a problem and the underlying
system doesn't require timestamps, they can go.

[wl_gamepad::button]

>> The trigger_index is 0 for left stick values and 1 for right stick
>> values.  Hardware with more triggers can potentially supply higher
>> values; the pressure-sensitive buttons on the ps3 controller would go
>> here, for instance.
>
> Could you be more clear about what other pressure-sensitive buttons on the
> PS3 controller you're referring to here?  I know they went a bit overboard
> on pressure sensitivity in the PS3 controller and seem to recall that even
> buttons like triangle etc. were pressure-sensitive.  That said, those
> buttons should map as buttons not triggers so that they can be picked up in
> a canonical way.  Are you simply planning to double-report events there?

I included this as a "this data could work without breaking the
protocol", but it's not essential.

In the particular case of the ps3 (and all of the dual shock
controllers, IIRC), all of the buttons are pressure sensitive with a
[0..255] range except "start", "select", "home" (on pads that have it)
and the stick clicks.  The face buttons, the dpad and all four
shoulder buttons are pressure sensitive.  Whether it's worth exporting
that is another question entirely; I've heard rumour that the ps4
controller removes pressure sensing from a lot of the buttons.

[wl_gamepad::extended]

> My feeling on this would be to wait until we have a use-case for it.  We can
> always bump the version and add an event if it comes up.  I think that's
> better than just assuming we can do something sensible with four generic
> parameters.

This is partly in response to things like the razer Wiimote-like
contraption that apparently spits out piles of quaternions, and also
things like hardcore flightsticks that have things like fixed-range
throttles.  I'm not convinced it's needed either, but I figured if I
was making a proposed protocol it was worth throwing it in for the
sake of discussion.

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel

Re: Input and games.

2013-04-27 Thread nerdopolis
Hi.

What about rumblepads? How would the protocol control the rumblepad in some 
controllers?
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-27 Thread Todd Showalter
On Sat, Apr 27, 2013 at 10:23 AM, nerdopolis
 wrote:

> What about rumblepads? How would the protocol control the rumblepad in some
> controllers?

I think rumble is beyond the scope of the proposed gamepad
protocol.  The problem is that (at least in my experience) whereas
many controllers have some form of rumble, the actual physical rumble
hardware has very little in common between devices.  Some things have
linear motors, some have rotary motors.  Some have both.  Or multiples
of one or the other.  Some (like the wiimote, IIRC) aren't actually
rumble per se; they're a speaker you stream sound to at low
bit rates.

The other problem (at least to me) is that rumble flows in the
opposite direction from input; in the gamepad protocol I'm proposing,
all the data is flowing from the hardware through the server to the
application.  Rumble runs the other way.  Rumble in general is far
more akin to sound playback than it is to input; you're basically
feeding a control stream to an actuator.

Beyond that, at least with some gamepads, the rumble hardware is
subject to restrictions; in particular, at least one bit of kit I've
worked with that's still in circulation doesn't get enough power over
the controller cable to run all of the rumble motors at the same time,
and if you turn too much on the board-level logic in the gamepad can
start acting wonky.

So, the short version of my answer is (1) I think rumble is a
totally different problem domain that wants its own protocol, and (2)
I'm not sure there's enough commonality or sanity in rumble hardware
to build a useful protocol.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-29 Thread Pekka Paalanen
On Sat, 27 Apr 2013 15:02:59 -0400
Todd Showalter  wrote:

> On Sat, Apr 27, 2013 at 10:23 AM, nerdopolis
>  wrote:
> 
> > What about rumblepads? How would the protocol control the rumblepad
> > in some controllers?
> 
> I think rumble is beyond the scope of the proposed gamepad
> protocol.  The problem is that (at least in my experience) whereas
> many controllers have some form of rumble, the actual physical rumble
> hardware has very little in common between devices.  Some things have
> linear motors, some have rotary motors.  Some have both.  Or multiples
> of one or the other.  Some (like the wiimote, IIRC) aren't actually
> rumble per se, and are actually a speaker you stream sound to at low
> bit rates.
> 
> The other problem (at least to me) is that rumble flows in the
> opposite direction from input; in the gamepad protocol I'm proposing,
> all the data is flowing from the hardware through the server to the
> application.  Rumble runs the other way.  Rumble in general is far
> more akin to sound playback than it is to input; you're basically
> feeding a control stream to an actuator.
> 
> Beyond that, at least with some gamepads, the rumble hardware is
> subject to restrictions; in particular, at least one bit of kit I've
> worked with that's still in circulation doesn't get enough power over
> the controller cable to run all of the rumble motors at the same time,
> and if you turn too much on the board-level logic in the gamepad can
> start acting wonky.
> 
> So, the short version of my answer is (1) I think rumble is a
> totally different problem domain that wants its own protocol, and (2)
> I'm not sure there's enough commonality or sanity in rumble hardware
> to build a useful protocol.

Sounds like this should definitely be punted to the evdev file
descriptor passing interface. There the game needs to know (via a
library or by itself) how to drive the particular controller, anyway.
And the display server will not be in the path of relaying rumble
commands.
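
For reference, rumble over a passed evdev fd would use the existing
Linux force-feedback interface; a sketch, with error handling omitted
and the magnitudes and duration chosen arbitrarily:

#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/input.h>

void rumble_once(int fd)
{
    struct ff_effect e;
    struct input_event play;

    memset(&e, 0, sizeof e);
    e.type = FF_RUMBLE;
    e.id = -1;                             /* let the kernel pick a slot */
    e.u.rumble.strong_magnitude = 0x8000;  /* big motor */
    e.u.rumble.weak_magnitude = 0x4000;    /* small motor */
    e.replay.length = 500;                 /* milliseconds */
    ioctl(fd, EVIOCSFF, &e);               /* upload the effect */

    memset(&play, 0, sizeof play);
    play.type = EV_FF;
    play.code = e.id;
    play.value = 1;                        /* start playback */
    write(fd, &play, sizeof play);
}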


Thanks,
pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-29 Thread Pekka Paalanen
On Fri, 26 Apr 2013 14:28:30 -0400
Todd Showalter  wrote:

> On Thu, Apr 25, 2013 at 8:50 AM, Pekka Paalanen 
> wrote:
> 
> > Todd has already listed what features a standard gamepad or
> > controller has. Now someone should start designing the protocol. :-)
> 
> Based on the wl_pointer docs, I should think it would look
> something like this:
> 
> --8<--
> 
> wl_gamepad::connect -- gamepad appears
> The connect event occurs when a gamepad is plugged in, attaches
> via wireless, or otherwise becomes available for use.  The message can
> also be generated in response to a client "enumerate gamepads" query.
> 
> The name argument may or may not be empty, depending on what
> information the server has available, but if present can be used by
> the client to determine what to do with wl_gamepad::extended events.
> 
> The pad_index is used to tell messages from different gamepads
> apart; a system may have multiple gamepads connected, and messages
> from them need to be distinguishable.  The pad_index value may be much
> higher than the number of gamepads currently connected if the user has
> been connecting and disconnecting gamepads in pathological ways.
> 
> The cookie may or may not contain useful data, and shouldn't be
> counted on, but it is a hint to the application using the data whether
> a connecting gamepad has been seen before.  If the hardware has a
> software-visible serial number, the cookie should be a hash of that
> value.  If the hardware is plugging in at a specific USB port, the
> cookie should be a hash of the device path.  The cookie exists so that
> if the application sees a gamepad disappear and then another gamepad
> appears, if the cookie for the old and new controllers match it can
> assume it has the same physical gamepad.
> 
> Arguments:
> time -- uint -- standard event timestamp
> name -- string -- device name
> pad_index -- uint -- which gamepad this is
> cookie -- uint -- unique device hash; UNRELIABLE, hint only
> 
> 
> wl_gamepad::disconnect -- gamepad disappears
> The disconnect event occurs when a gamepad becomes unavailable,
> either due to unplugging or signal loss.
> 
> Arguments:
> time -- uint -- standard event timestamp
> pad_index -- uint -- which gamepad this is
> 

Hi Todd,

a problem here is that to receive a wl_gamepad::connect event, you
first have to create a wl_gamepad protocol object, which is a bit
counterintuitive.

A wl_gamepad protocol object should correspond to a single physical
device. So, we would have a wl_gamepad object for each controller, and
you do not need the pad_index argument in any of the events. This
should be easier on the game programmer, too, since you can attach
user data to each protocol object in libwayland-client, and use that in
the callbacks without any global data.

That leaves the question, where do we put the connect and disconnect
event equivalents. Below you mention also keyboards, so instead of
playing ad-hoc, let's use wl_seats the way they are designed.

A wl_seat would need to grow a capability bit for wl_gamepad, and a
request to create a wl_gamepad object. When you use that request to
create a wl_gamepad, the first thing it does is send its description:
name and cookie in your proposal, as an extra event type.

That limits us to one wl_gamepad device per wl_seat, so a server needs
to create more wl_seats for more controllers. That shouldn't be any
problem, these seats would only have the gamepad capability by default.

If your gamepad actually had a keyboard, or maybe even a touchpad (if
that is supposed to be used as a pointer device), it would simply be
advertised as a standard wl_keyboard or wl_pointer on the wl_seat. Each
player would have their own wl_seat, and it is obvious which keyboard
belongs with which gamepad.

If we or the user wants the system keyboard and mouse, i.e. the things
not on a gamepad, as a separate seat instead of merged with one of the
gamepads, that is simply a server configuration thing. Think about a
desktop GUI, where you tick a box "keep system keyboard separate", kind
of thing.

Oh, and the disconnect event. The standard wl_seat way for that seems
to be a new capabilities event, with the gamepad bit unset.
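
In client code that would surface through the ordinary wl_seat
listener; the gamepad capability bit below is hypothetical, everything
else is the existing API:

#include <wayland-client.h>

static void seat_capabilities(void *data, struct wl_seat *seat,
                              uint32_t caps)
{
    /* Real bits today: WL_SEAT_CAPABILITY_POINTER, _KEYBOARD, _TOUCH.
     * A hypothetical WL_SEAT_CAPABILITY_GAMEPAD bit would appear in
     * caps on connect and vanish from caps on disconnect. */
    if (caps & WL_SEAT_CAPABILITY_KEYBOARD) {
        /* get or release the wl_keyboard as appropriate */
    }
}

static void seat_name(void *data, struct wl_seat *seat, const char *name)
{
}

static const struct wl_seat_listener seat_listener = {
    .capabilities = seat_capabilities,
    .name = seat_name,
};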


All this still leaves some details unsolved, like which Wayland client
should receive the gamepad events on a wl_seat? The one having the
keyboard focus? Probably. But what if a wl_seat has no wl_keyboard or
wl_pointer? Just the first client that creates a wl_gamepad? What if
you have two games running, and you want to switch between them? Should
a wl_gamepad have its own focused surface attribute? How do you assign
that focus? If you have other input devices in addition to a gamepad on
a wl_seat, how do you get all their foci to the game, when the user
wants it? How does the user indicate he wants it?

Many of the

Re: Input and games.

2013-04-29 Thread Todd Showalter
On Mon, Apr 29, 2013 at 4:15 AM, Pekka Paalanen  wrote:

> a problem here is that to receive a wl_gamepad::connect event, you
> first have to create a wl_gamepad protocol object, which is a bit
> counterintuitive.
>
> A wl_gamepad protocol object should correspond to a single physical
> device. So, we would have a wl_gamepad object for each controller, and
> you do not need the pad_index argument in any of the events. This
> should be easier on the game programmer, too, since you can attach
> user data to each protocol object in libwayland-client, and use that in
> the callbacks without any global data.

That's not necessarily how games are designed, though; I know that
with our engine, input is represented internally as an array of
gamepads.  It lets us feed a pad with a stack of pad values for
things like demos and "replays".

> That leaves the question, where do we put the connect and disconnect
> event equivalents. Below you mention also keyboards, so instead of
> playing ad-hoc, let's use wl_seats the way they are designed.
>
> A wl_seat would need to grow a capability bit for wl_gamepad, and a
> request to create a wl_gamepad object. When you use that request to
> create a wl_gamepad, the first thing it does is send its description:
> name and cookie in your proposal, as an extra event type.
>
> That limits us to one wl_gamepad device per wl_seat, so a server needs
> to create more wl_seats for more controllers. That shouldn't be any
> problem, these seats would only have the gamepad capability by default.

This is I think where there's a potential problem.  Gamepads live
in a somewhat more chaotic world than mice and keyboards; wireless
ones have much shorter battery lives, and players are used to being
able to unplug and plug them while playing.  It's not uncommon for
someone to (say) play for a bit with a gamepad they don't like (maybe
it was on sale), unplug it, and plug in one they like better.  Or drop
the gamepad that ran out of batteries in the charger and pull out
another.

Players also expect to be able to add a gamepad part way through a
game, at least for some games.

So, gamepads can appear and disappear during the game's runtime,
and the game needs to know that is happening.  There also need to be
some heuristics about which gamepad is what player (or seat).

> If your gamepad actually had a keyboard, or maybe even a touchpad (if
> that is supposed to be used as a pointer device), it would simply be
> advertised as a standard wl_keyboard or wl_pointer on the wl_seat. Each
> player would have their own wl_seat, and it is obvious which keyboard
> belongs with which gamepad.

That does solve that problem nicely.  It's somewhat of a corner
case, though, so I wouldn't move mountains to solve it.

> Oh, and the disconnect event. The standard wl_seat way for that seems
> to be a new capabilities event, with the gamepad bit unset.

Ok.

> All this still leaves some details unsolved, like which Wayland client
> should receive the gamepad events on a wl_seat? The one having the
> keyboard focus? Probably. But what if a wl_seat has no wl_keyboard or
> wl_pointer? Just the first client that creates a wl_gamepad? What if
> you have two games running, and you want to switch between them? Should
> a wl_gamepad have its own focused surface attribute? How do you assign
> that focus? If you have other input devices in addition to a gamepad on
> a wl_seat, how do you get all their foci to the game, when the user
> wants it? How does the user indicate he wants it?

I think all gamepad input should be routed to whatever has focus
or whatever has grabbed input.  I don't see a scenario where it makes
sense to route different gamepads separately unless you're doing
multiuser multihead (which I assume is the point of the wl_seat
abstraction).

>> Arguments:
>> time -- uint -- standard event timestamp
>> pad_index -- uint -- which gamepad this is
>> stick_index -- uint -- 0 for left stick, 1 for right stick, 2 for
>> dpad x -- int -- the x axis of the stick mapped to [-2^15 .. 2^15 - 1]
>> y -- int -- the y axis of the stick mapped to [-2^15 .. 2^15 - 1]
>
> All int and uint are 32-bit in the protocol, btw, should be enough
> precision for intervals like [0, 1] and [-1, 1], I think.
>
> I agree that the fixed type is not really suitable here. It was
> designed to hold pixel coordinates foremost.

I figured; it looked like a suitable encoding for subpixel
precision rather than for components of normalized vectors.  For input
axis values float remains the ideal, but fixed point with 16 bits
below the decimal should cover most things adequately.  There will
probably be roundoff error for things like quaternions, but those
would probably be better served by a 2.30 fixed format anyways.
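
A sketch of the two encodings mentioned, under the assumption of a
32-bit wire word: a 16-bit fractional part for plain axis values, and
2.30 for unit-length quantities like quaternion components:

#include <stdint.h>

/* 16 fractional bits: plenty of precision for [-1.0, 1.0] axes. */
static int32_t axis_to_fixed16(float f)
{
    return (int32_t)(f * 65536.0f);
}

static float fixed16_to_axis(int32_t v)
{
    return (float)v / 65536.0f;
}

/* 2.30: 30 fractional bits, range (-2.0, 2.0), so the [-1, 1]
 * endpoints of a quaternion component are representable exactly. */
static int32_t quat_to_fixed30(float f)
{
    return (int32_t)(f * 1073741824.0f);  /* 2^30 */
}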

>> 
>> wl_gamepad::sysbutton -- gamepad system button event
>> A sysbutton event occurs when the system button is pressed.

Re: Input and games.

2013-04-29 Thread Bill Spitzak
Has anybody thought about pens (ie wacom tablets)? These have 5 degrees 
of freedom (most cannot distinguish rotation about the long axis of the 
pen). There are also spaceballs with full 6 degrees of freedom.


One idea I remember from Irix was that all the analog controls were 
1-dimensional. A mouse was actually 2 analog controls. This avoids the 
need to define how many degrees of freedom a control has, instead it is 
just N different controls. Quaternions are a problem though because the 
4 numbers are not independent so there must be a way to get a set of 
changes together.


Another idea was that buttons had the same api as analog controls, it's 
just that they only reported 0 or +1, never any fractions (and since it 
sounds like some controls have pressure-sensitive buttons this may make 
it easier to use the same code on different controls).

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-29 Thread Todd Showalter
On Mon, Apr 29, 2013 at 1:44 PM, Bill Spitzak  wrote:

> Has anybody thought about pens (ie wacom tablets)? These have 5 degrees of 
> freedom (most cannot distinguish rotation about the long axis of the pen). 
> There are also spaceballs with full 6 degrees of freedom.

I think pens need to be their own thing; the needs of pen-based
programs are usually pretty specific.  It would be nice if there was a
pen protocol that handled angle, pressure and the like, but at the
same time I get the sense that the year to year changes in pens are
pure improvements, which means the protocol would probably need to
express things like pressure and angle in float values.  Otherwise,
you'll quickly reach a place where the pen hardware is reporting more
bits of precision than the protocol can encode.  Or wacom pens will
start reporting rotation as well.

There's also potentially a good argument to be made for supporting
6dof devices, but the bag of devices it would be supporting is...
mixed.  Off the top of my head, you'd be looking at potentially
supporting:

- the wiimote and nunchuk (accelerometers plus camera support)
- the ps3 dual shock controller (accelerometers)
- the ps3 move controller and nav controller (accel. plus camera support)
- the kinect (camera)
- the leap motion (camera)
- the razer hydra (???)
- the 3dconnexxion space pilot, space mouse and space navigator (sensors)

It's hard to boil that down to any kind of common functionality.
I strongly suspect you'd have to have the protocol be
self-descriptive.  The connect message would have to describe the
capabilities of the device.  That would be way better than nothing,
but not a lot of fun to write or use.  You'll run into fun things like
"the capabilities of the wiimote can change drastically at runtime,
depending on what the player decides to plug in to it".  For instance,
the player could yank the nunchuck from the wiimote (removing a stick,
a couple of buttons, and a low-res accelerometer) and replace it with
a classic controller (two sticks, two shoulder triggers, dpad,
buttons) plugged in through a motion plus (high-res auxiliary
accelerometer).  And then pull that and jack the wiimote into a light
gun.

In some cases (kinect, notably) the data coming in is very raw; I
haven't worked with kinect myself, but my understanding is that what
you get is a stream of image pairs; half of each pair is the (possibly
compressed?) RGB camera image, and half of each pair is a depth map
from the infrared camera.  My understanding is that everything else is
signal processing on the host.

At the other end, we have the wiimote pointer, which is also an
infrared camera streaming low-res images, looking at a pair of
infrared LEDs (the two ends of the hilariously misnamed "sensor bar").
 As with the kinect, as far as I understand all the magic happens on
the host in signal processing routines.  The wiimote has
accelerometers as well (especially if you have a motion plus
attached), but the "light gun" screen pointer functionality is
entirely driven by the camera looking at two LEDs a known, fixed
distance apart.

That's the kind of reason I'm evangelizing a standard gamepad
protocol; solving the whole game controller input problem (with force
feedback and adjustable resistance, lights, sound playback, dpi
settings, alternate configurations and all controls made available) is
desirable for some games, but it is also a herculean task both for the
protocol source and for whatever consumes it.  Most games just want
simple mouse or gamepad input.  I'd prefer if the more complex input
is available for games that want to use it, but I think if a game
wants to make full use of unusual hardware then it's reasonable to
expect that game to do the heavy lifting.

> One idea I remember from Irix was that all the analog controls were 
> 1-dimensional. A mouse was actually 2 analog controls. This avoids the need 
> to define how many degrees of freedom a control has, instead it is just N 
> different controls. Quaternions are a problem though because the 4 numbers 
> are not independent so there must be a way to get a set of changes together.

That kind of worked on IRIX because the standard controls were
simple (three button mouse, standard keyboard) and the nonstandard
controls (like those insane video control battleboards) were driven by
very expensive per-seat-per-year cost software that could afford to
burn engineering time building custom support for something that
wouldn't look out of place in the cockpit of something from Gundam.

In practice, trying to write end-user software on self-descriptive
input protocols winds up being a bit of a pain, at least in my
experience.

> Another idea was that buttons had the same api as analog controls, it's just 
> that they only reported 0 or +1, never any fractions (and since it sounds 
> like some controls have pressure-sensitive buttons this may make it easier to 
> use the same code on different controls).

Re: Input and games.

2013-04-29 Thread Rick Yorgason
Todd Showalter  writes:
> I think all gamepad input should be routed to whatever has focus
> or whatever has grabbed input.

...and then...

> > I don't think we need a separate event for this, just the normal button
> > event is enough. If the display server wants to eat the event, it can
> > do so in any case. Or was there some other reason for this?
> 
> Mostly to logically separate the home button from the others.
> It's not available on all gamepads.  The gamepads that do have it are
> gamepads for hardware platforms (wii, ps3, xbox 360), and the button's
> purpose is "interrupt the game and bring up the OS".  Pressing it gets
> you access to meta things like system settings, gamepad settings, and
> GUI buttons to do things like quit the game, which is useful in
> single-screen environments where the game is running full-screen.
> 
> Splitting it off isn't essential.

One reason you might want to split the system button into a separate message
is because you might not want it to follow normal focus rules. For instance,
I can imagine Steam's Big Picture using it, but since Steam isn't a
compositor, it wouldn't be able to swallow the button press.

A more generic solution would allow apps to steal all input from individual
buttons, but that's probably overkill. The simplest solution is just to make
it so the system button event is broadcast to all programs, even those
without focus.

Also, I sort of doubt this changes anything, but I just found out that the
OUYA does something strange with home button to compensate for the lack of a
start button: it sends a 'start button' message if you just tap the home
button, but sends a 'system button' message if you hold it. (In the dev kits
you can also double-tap the button, since it seems that the holding
behaviour required a firmware update.)

-Rick-

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-30 Thread Pekka Paalanen
On Mon, 29 Apr 2013 10:44:17 -0700
Bill Spitzak  wrote:

> Has anybody thought about pens (ie wacom tablets)?

Yes, they have been briefly discussed. They need their own interfaces,
just like keyboards and pointers.

- pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-30 Thread Pekka Paalanen
On Mon, 29 Apr 2013 20:04:12 +0000 (UTC)
Rick Yorgason  wrote:

> Todd Showalter  writes:
> > I think all gamepad input should be routed to whatever has focus
> > or whatever has grabbed input.
> 
> ...and then...
> 
> > > I don't think we need a separate event for this, just the normal
> > > button event is enough. If the display server wants to eat the
> > > event, it can do so in any case. Or was there some other reason
> > > for this?
> > 
> > Mostly to logically separate the home button from the others.
> > It's not available on all gamepads.  The gamepads that do have it
> > are gamepads for hardware platforms (wii, ps3, xbox 360), and the
> > button's purpose is "interrupt the game and bring up the OS".
> > Pressing it gets you access to meta things like system settings,
> > gamepad settings, and GUI buttons to do things like quit the game,
> > which is useful in single-screen environments where the game is
> > running full-screen.
> > 
> > Splitting it off isn't essential.
> 
> One reason you might want to split the system button into a separate
> message is because you might not want it to follow normal focus
> rules. For instance, I can imagine Steam's Big Picture using it, but
> since Steam isn't a compositor, it wouldn't be able to swallow the
> button press.

I don't think this helps anything. Events are either swallowed by the
server, or sent to the client in focus. As long as the event is part of
the same wl_gamepad interface, having it separately makes no difference.

> A more generic solution would allow apps to steal all input from
> individual buttons, but that's probably overkill. The simplest
> solution is just to make it so the system button event is broadcast
> to all programs, even those without focus.

Then all those programs need to create the wl_gamepad object. Whether
that works or not, depends on how we design the input event
dispatching, i.e. the gamepad focus.

It might be more sensible to create another protocol extension for
this. If the server does not handle the home button, or if nothing
subscribed the home_interface, then it gets passed to the game as
usual. The whole purpose of home_interface would be to just notify
about the home button presses.

It's a similar case with volume control buttons. If you want them to
control master volume, the display server must intercept them and
either handle them itself, or pass then via a special interface to a
volume control applet. If you want them to control per-app volume by
keyboard focus, you might want to route these buttons to the application
as normal.

What an application could actually do when it receives a home button
press via the special path is an important question. Since Wayland does
not allow random clients to just jump in, you need to specifically
think how to enable a desired response. In that respect, having a third
party program handling the home button is very problematic, since you
probably want something to happen on screen.


Thanks,
pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-30 Thread Pekka Paalanen
Hi Todd,

you've provided lots of valuable information already. Unfortunately my
input is left as hand-waving, since I cannot dedicate to designing this
protocol myself (as in writing the XML spec).


On Mon, 29 Apr 2013 10:17:31 -0400
Todd Showalter  wrote:

> On Mon, Apr 29, 2013 at 4:15 AM, Pekka Paalanen 
> wrote:
> 
> > a problem here is that to receive a wl_gamepad::connect event, you
> > first have to create a wl_gamepad protocol object, which is a bit
> > counterintuitive.
> >
> > A wl_gamepad protocol object should correspond to a single physical
> > device. So, we would have a wl_gamepad object for each controller,
> > and you do not need the pad_index argument in any of the events.
> > This should be easier on the game programmer, too, since you can
> > attach user data to each protocol object in libwayland-client, and
> > use that in the callbacks without any global data.
> 
> That's not necessarily how games are designed, though; I know that
> with our engine, input is represented internally as an array of
> gamepads.  It lets us do things like feed a pad with a stack of pad
> values to do things like demos and "replays".

Alright. The remaining reasons are still significant, IMHO: this is the
design pattern for Wayland input devices and even in general, and the
protocol needs to move fewer bytes when the pad_index is effectively
baked into the object id on the wire.

You can make your own pad_index and store it in the user data field, or
just store a pointer to your table. It's up to you, and you don't need
to deal with someone else's indices.

> > That leaves the question, where do we put the connect and disconnect
> > event equivalents. Below you mention also keyboards, so instead of
> > playing ad-hoc, let's use wl_seats the way they are designed.
> >
> > A wl_seat would need to grow a capability bit for wl_gamepad, and a
> > request to create a wl_gamepad object. When you use that request to
> > create a wl_gamepad, the first thing it does is send its
> > description: name and cookie in your proposal, as an extra event
> > type.
> >
> > That limits us to one wl_gamepad device per wl_seat, so a server
> > needs to create more wl_seats for more controllers. That shouldn't
> > be any problem, these seats would only have the gamepad capability
> > by default.
> 
> This is I think where there's a potential problem.  Gamepads live
> in a somewhat more chaotic world than mice and keyboards; wireless
> ones have much shorter battery lives, and players are used to being
> able to unplug and plug them while playing.  It's not uncommon for
> someone to (say) play for a bit with a gamepad they don't like (maybe
> it was on sale), unplug it, and plug in one they like better.  Or drop
> the gamepad that ran out of batteries in the charger and pull out
> another.
> 
> Players also expect to be able to add a gamepad part way through a
> game, at least for some games.
> 
> So, gamepads can appear and disappear during the game's runtime,
> and the game needs to know that is happening.  There also need to be
> some heuristics about which gamepad is what player (or seat).

So you would rather handle all that in your game, than rely on the
display server to sort it out for you? The user would have to set up
each game, instead of just the display server. The display server could
also drive the player id indicators on some game controllers.

I can easily imagine a game controller configuration GUI for a display
server, where you can register game controllers and assign them to
different seats, just like you would do for keyboards and mice.

I don't really see why each game would need to reimplement this. Is it
just a habit, since there has not been any options?

Doesn't the game and the display server have the exact same problems?
Identifying devices, assigning them to players, ...

Except that when a display server does it, it can also be smart about
input dispatching to different simultaneous clients (games). Games
might be exclusive applications in general, but I don't think we should
encode that into the protocol.

> > If your gamepad actually had a keyboard, or maybe even a touchpad
> > (if that is supposed to be used as a pointer device), it would
> > simply be advertised as a standard wl_keyboard or wl_pointer on the
> > wl_seat. Each player would have their own wl_seat, and it is
> > obvious which keyboard belongs with which gamepad.
> 
> That does solve that problem nicely.  It's somewhat of a corner
> case, though, so I wouldn't move mountains to solve it.
> 
> > Oh, and the disconnect event. The standard wl_seat way for that
> > seems to be a new capabilities event, with the gamepad bit unset.
> 
> Ok.
> 
> > All this still leaves some details unsolved, like which Wayland
> > client should receive the gamepad events on a wl_seat? The one
> > having the keyboard focus? Probably. But what if a wl_seat has no
> > wl_keyboard or wl_pointer? Just the first client that creates a
> > wl_gamepad?

Re: Input and games.

2013-04-30 Thread Todd Showalter
On 2013-04-30, at 3:29 AM, Pekka Paalanen  wrote:

> What an application could actually do when it receives a home button
> press via the special path is an important question. Since Wayland does
> not allow random clients to just jump in, you need to specifically
> think how to enable a desired response. In that respect, having a third
> party program handling the home button is very problematic, since you
> probably want something to happen on screen.

Perhaps this is a place for window manager functionality; if there is a 
separate home button message, we could allow a window to request to filter home 
button messages for another window.  So the hypothetical steam behaviour would 
be to request the home button events for any games it spawns.

The advantage of the separate event in that case would be isolating the 
home button event, since I presume that nobody wants the security nightmare of 
arbitrary event filtering. The home button is a system meta event, so it's 
reasonable to have other processes watching it.

I suppose it could be a broadcast event...

 Todd.

--
  Todd Showalter, President
  Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-30 Thread Todd Showalter
On Tue, Apr 30, 2013 at 5:29 AM, Pekka Paalanen  wrote:

> you've provided lots of valuable information already. Unfortunately my
> input is left as hand-waving, since I cannot dedicate to designing this
> protocol myself (as in writing the XML spec).

I'm getting set up to write code.  Someone kindly gave me a bash
script to pull down all the components, so once I get things set up
properly I'll see if I can get a patch together.

>> That's not necessarily how games are designed, though; I know that
>> with our engine, input is represented internally as an array of
>> gamepads.  It lets us do things like feed a pad with a stack of pad
>> values to do things like demos and "replays".
>
> Alright. The remaining reasons are still significant, IMHO: this is the
> design pattern for Wayland input devices and even in general, and the
> protocol needs to move less bytes when the pad_index is effectively
> baked into the object id on the wire.

The question is, is a gamepad an object, or is a *set* of gamepads
an object?

>> This is I think where there's a potential problem.  Gamepads live
>> in a somewhat more chaotic world than mice and keyboards; wireless
>> ones have much shorter battery lives, and players are used to being
>> able to unplug and plug them while playing.  It's not uncommon for
>> someone to (say) play for a bit with a gamepad they don't like (maybe
>> it was on sale), unplug it, and plug in one they like better.  Or drop
>> the gamepad that ran out of batteries in the charger and pull out
>> another.
>>
>> Players also expect to be able to add a gamepad part way through a
>> game, at least for some games.
>>
>> So, gamepads can appear and disappear during the game's runtime,
>> and the game needs to know that is happening.  There also need to be
>> some heuristics about which gamepad is what player (or seat).
>
> So you would rather handle all that in your game, than rely on the
> display server to sort it out for you? The user would have to set up
> each game, instead of just the display server. The display server could
> also drive the player id indicators on some game controllers.

I'd rather the display server sorted it out, honestly, I just
wasn't sure how much policy people were comfortable with pushing into
the display server.

> I can easily imagine a game controller configuration GUI for a display
> server, where you can register game controllers and assign them to
> different seats, just like you would do for keyboards and mice.

I'd prefer something like that.  On the console side of things,
this is a problem that actually usually gets thrown to the game to
solve, and it's always a hassle.

>> I think all gamepad input should be routed to whatever has focus
>> or whatever has grabbed input.  I don't see a scenario where it makes
>> sense to route different gamepads separately unless you're doing
>> multiuser multihead (which I assume is the point of the wl_seat
>> abstraction).
>
> A wl_seat does not relate to any specific output. Each wl_seat on a
> server simply shares all outputs with all other wl_seats.
>
> If you want to do separate sessions, that is each user has his own
> desktop, own input devices, and own outputs, then you pretty much run
> one display server for each user.
>
> wl_seats OTOH allow one to have several people collaborating on the
> same session and desktop, or just one person who needs more than one
> keyboard focus, for instance. This is one display server with several
> wl_seats.
>
> wl_seat is not really a seat in the physical sense. It may be better
> thought via input devices and input foci. One wl_seat has zero or
> one wl_keyboard, wl_pointer, etc. If you have several physical
> keyboards, they all act as one. To have two keyboards independent, they
> are assigned to different wl_seats. Then each keyboard can be typing
> into a different window at the same time.
>
> Focus is per wl_seat, and mostly per input device type. Keyboard and
> pointer have their own foci, wl_touch does not have a focus at all in
> the protocol.
>
> So for games with multiple local players on the same screen, wl_seat
> would be just a player id.
>
> Does this clarify what I was talking about?

Ok, that makes sense.  So, from the game point of view, if each
gamepad lives in its own wl_seat, how does the game detect that new
gamepads have arrived or gone away?  I assume there are wl_seat
create/destroy events?

 Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-30 Thread Pekka Paalanen
On Tue, 30 Apr 2013 08:30:52 -0400
Todd Showalter  wrote:

> On 2013-04-30, at 3:29 AM, Pekka Paalanen  wrote:
> 
> > What an application could actually do when it receives a home button
> > press via the special path is an important question. Since Wayland
> > does not allow random clients to just jump in, you need to
> > specifically think how to enable a desired response. In that
> > respect, having a third party program handling the home button is
> > very problematic, since you probably want something to happen on
> > screen.
> 
> Perhaps this is a place for window manager functionality; if
> there is a separate home button message, we could allow a window to
> request to filter home button messages for another window.  So the
> hypothetical steam behaviour would be to request the home button
> events for any games it spawns.

Hello Todd,

unfortunately that is not how Wayland works at all. All clients are
isolated from the start, regardless how they are spawned. The idea
might be ok, but concepts and protocol design will be very different.

> The advantage of the separate event in that case would be
> isolating the home button event, since I presume that nobody wants
> the security nightmare of arbitrary event filtering. The home button
> is a system meta event, so it's reasonable to have other processes
> watching it.
> 
> I suppose it could be a broadcast event...

As far as I understand, what you want is what I described with the
home_interface, with the exception that there is no filtering based on
the currently focused client. The home_interface is just something an
unrelated client could subscribe to, and then it would get the home
button events.

The third party program handling the home button is a really problematic
case, and this still does not solve the question of what the client
receiving the home button event can actually do. Normally Wayland does
not allow clients to e.g. randomly raise themselves to top, not to
mention do anything to other client's windows, since they cannot even
reference anything from another client. Getting the event is useless
if you cannot react.

Could we instead design a behaviour for the home button, which the
display server (which is also the window manager) would implement
itself, and which would also satisfy the Steam use case?

For example: the home button minimizes the currently focused fullscreen
window. Or, maybe it brings up the task switcher, if the task switcher
can be controlled by a gamepad. With the task switcher, the user could
select the Steam client, or another concurrent game, or...

I just believe that sending "the home button was pressed" to all
clients as a "system meta event" is not right. Instead, the server
would intercept it, and send specific events to clients' windows, like
un-fullscreen, un-maximize, minimize, raise, lower, go to sleep, etc.
that all depend on the current shell (i.e. a desktop vs. a game console
GUI vs. a phone vs. a TV vs. ...).
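
To make this concrete, here is how such a policy might look inside a
compositor. Everything below is a hypothetical sketch with made-up
my_* types and helpers; no such API exists today:

#include <stdbool.h>

struct my_shell;
struct my_window;

/* Hypothetical shell internals, purely for illustration. */
struct my_window *my_shell_focused_window(struct my_shell *shell);
bool my_window_is_fullscreen(struct my_window *window);
void my_window_minimize(struct my_window *window);
void my_shell_show_task_switcher(struct my_shell *shell);

/* The server intercepts the home button before any client sees it
 * and turns it into shell policy, so wl_gamepad never needs a
 * special home button event. */
static void
shell_handle_home_button(struct my_shell *shell)
{
    struct my_window *top = my_shell_focused_window(shell);

    if (top && my_window_is_fullscreen(top))
        my_window_minimize(top);            /* reveal e.g. BP underneath */
    else
        my_shell_show_task_switcher(shell); /* gamepad-driven switcher */
}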

If we solve this in the server, we don't have to note it in the gamepad
protocol at all. In fact, I think we can just postpone this problem,
since the wl_gamepad should be irrelevant to how a home button is
handled. wl_gamepad can perfectly well relay the home button events to
the game, anyway. The question is when and if that happens, and that
does not need to be written down in the specification.

Doesn't really help that I've never used Steam, nor know what Big
Picture is. But I do have a PS3 at home! :-)


Thanks,
pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-30 Thread Todd Showalter
On Tue, Apr 30, 2013 at 9:26 AM, Pekka Paalanen  wrote:

> unfortunately that is not how Wayland works at all. All clients are
> isolated from the start, regardless how they are spawned. The idea
> might be ok, but concepts and protocol design will be very different.

I had a feeling that might be the case.

> Doesn't really help that I've never used Steam, nor know what Big
> Picture is. But I do have a PS3 at home! :-)

"Big Picture Mode" (let's call it BP for the purposes of this
description) in Steam is a full-screen program that shows things like
your game list, and is basically a glorified game launcher with some
ancillary functionality like updating and installing games, and store
access.  The important thing for this discussion is that if you are
playing a game you launched from BP, pressing the home button
backgrounds the game and foregrounds BP.  That is, you return to BP
rather than the desktop.

The process is similar to what happens with the home button on the
PS3; the game drops to the background, and the OS puts up an overlay.
Press the home button again to dismiss the overlay and return to the
game.

I'm not going to argue that this is essential behavior; I don't
think it is.  It may be desirable in some cases, but it might also be
desirable to treat the home button totally separately, maybe have it
bring up a gamepad config screen or something if that makes sense.
The main place it's desirable is in the case of living room PCs, where
people will tend to want to be running things like games and movie
players fullscreen, so a standard "get me back to the OS" button is a
useful abstraction.

I do think Valve in particular will want to be able to have some
"hand focus to and raise designated window upon home button press"
functionality.  That said, the game could just trap the home button
itself and hand off to the BP window, assuming that's possible
somehow.

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-30 Thread Jason Ekstrand
On Tue, Apr 30, 2013 at 10:25 AM, Todd Showalter wrote:

> On Tue, Apr 30, 2013 at 9:26 AM, Pekka Paalanen 
> wrote:
>
> > unfortunately that is not how Wayland works at all. All clients are
> > isolated from the start, regardless how they are spawned. The idea
> > might be ok, but concepts and protocol design will be very different.
>
> I had a feeling that might be the case.
>
> > Doesn't really help that I've never used Steam, nor know what Big
> > Picture is. But I do have a PS3 at home! :-)
>
> "Big Picture Mode" (let's call it BP for the purposes of this
> description) in Steam is a full-screen program that shows things like
> your game list, and is basically a glorified game launcher with some
> ancillary functionality like updating and installing games, and store
> access.  The important thing for this discussion is that if you are
> playing a game you launched from BP, pressing the home button
> backgrounds the game and foregrounds BP.  That is, you return to BP
> rather than the desktop.
>
> The process is similar to what happens with the home button on the
> PS3; the game drops to the background, and the OS puts up an overlay.
> Press the home button again to dismiss the overlay and return to the
> game.
>
> I'm not going to argue that this is essential behavior; I don't
> think it is.  It may be desirable in some cases, but it might also be
> desirable to treat the home button totally separately, maybe have it
> bring up a gamepad config screen or something if that makes sense.
> The main place it's desirable is in the case of living room PCs, where
> people will tend to want to be running things like games and movie
> players fullscreen, so a standard "get me back to the OS" button is a
> useful abstraction.
>

I think the best way to do it is to simply treat it like an available
hotkey.  If the user wants to configure the home button to do something
special at the compositor level they can do so.  This may include bringing
up the window switcher as pq said, going to desktop, going to the media
center, etc.  Otherwise, it gets passed to the client as a regular button
event.

As far as Steam and Big Picture go, I think they run some sort of
overlay anyway so that their in-game chat etc. works.  Whatever they use to
handle that could also handle the home button.  I'm not sure what they'll
use to do that in the wayland world.  They may have some sort of embedded
compositor or just a client-side library that all their games include.
Whatever way they do it, they can handle the home button through that.

--Jason Ekstrand
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-04-30 Thread Bill Spitzak

Pekka Paalanen wrote:


> Normally Wayland does
> not allow clients to e.g. randomly raise themselves to top


I hope it allows this.

Otherwise clients are going to resort to destroying/recreating their 
surfaces, which is how we work around bugs in Gnome window managers when 
click-to-raise is turned off on them. It makes the windows blink but at 
least we can make a reliable popup dialog.

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-02 Thread Pekka Paalanen
On Tue, 30 Apr 2013 09:14:48 -0400
Todd Showalter  wrote:

> On Tue, Apr 30, 2013 at 5:29 AM, Pekka Paalanen 
> wrote:
> 
> > you've provided lots of valuable information already. Unfortunately
> > my input is left as hand-waving, since I cannot dedicate to
> > designing this protocol myself (as in writing the XML spec).
> 
> I'm getting set up to write code.  Someone kindly gave me a bash
> script to pull down all the components, so once I get things set up
> properly I'll see if I can get a patch together.

Excellent!

> >> That's not necessarily how games are designed, though; I know
> >> that with our engine, input is represented internally as an array
> >> of gamepads.  It lets us do things like feed a pad with a stack of
> >> pad values to do things like demos and "replays".
> >
> > Alright. The remaining reasons are still significant, IMHO: this is
> > the design pattern for Wayland input devices and even in general,
> > and the protocol needs to move less bytes when the pad_index is
> > effectively baked into the object id on the wire.
> 
> The question is, is a gamepad an object, or is a *set* of gamepads
> an object?

Both, just like a wl_pointer can be one or more physical mice. Clients
have no way to know whether a wl_pointer is backed by several mice, nor
to separate events by physical device.

The interfaces are abstract in that sense.

> >> This is I think where there's a potential problem.  Gamepads
> >> live in a somewhat more chaotic world than mice and keyboards;
> >> wireless ones have much shorter battery lives, and players are
> >> used to being able to unplug and plug them while playing.  It's
> >> not uncommon for someone to (say) play for a bit with a gamepad
> >> they don't like (maybe it was on sale), unplug it, and plug in one
> >> they like better.  Or drop the gamepad that ran out of batteries
> >> in the charger and pull out another.
> >>
> >> Players also expect to be able to add a gamepad part way
> >> through a game, at least for some games.
> >>
> >> So, gamepads can appear and disappear during the game's
> >> runtime, and the game needs to know that is happening.  There also
> >> need to be some heuristics about which gamepad is what player (or
> >> seat).
> >
> > So you would rather handle all that in your game, than rely on the
> > display server to sort it out for you? The user would have to set up
> > each game, instead of just the display server. The display server
> > could also drive the player id indicators on some game controllers.
> 
> I'd rather the display server sorted it out, honestly, I just
> wasn't sure how much policy people were comfortable with pushing into
> the display server.

I think we can put lots of policy in the server. A Wayland server is not
just a generic display server like X, but is actually tied to the GUI
paradigms, shell, and the desktop environment. In principle, every DE
will have its own server, and code re-use is punted as an
implementation detail. We prefer to communicate intent (set_fullscreen)
rather than primitive actions (set window size && position it to 0,0 &&
raise).
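
(For reference, the core protocol already works this way: a client
asks for fullscreen with a single request and leaves the mechanics to
the server. The sketch below assumes a bound wl_shell_surface named
shell_surface.)

/* Intent, not primitives: no size, position or raise requests. */
wl_shell_surface_set_fullscreen(shell_surface,
                                WL_SHELL_SURFACE_FULLSCREEN_METHOD_DEFAULT,
                                0,     /* framerate: 0 = server decides */
                                NULL); /* output: NULL = server chooses */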

For example, a window manager with all its policies is just a component
inside a Wayland server. It's also intended to be user configurable,
like a modern DE.

> > I can easily imagine a game controller configuration GUI for a
> > display server, where you can register game controllers and assign
> > them to different seats, just like you would do for keyboards and
> > mice.
> 
> I'd prefer something like that.  On the console side of things,
> this is a problem that actually usually gets thrown to the game to
> solve, and it's always a hassle.

Alright, cool.

> >> I think all gamepad input should be routed to whatever has
> >> focus or whatever has grabbed input.  I don't see a scenario where
> >> it makes sense to route different gamepads separately unless
> >> you're doing multiuser multihead (which I assume is the point of
> >> the wl_seat abstraction).
> >
> > A wl_seat does not relate to any specific output. Each wl_seat on a
> > server simply shares all outputs with all other wl_seats.
> >
> > If you want to do separate sessions, that is each user has his own
> > desktop, own input devices, and own outputs, then you pretty much
> > run one display server for each user.
> >
> > wl_seats OTOH allow one to have several people collaborating on the
> > same session and desktop, or just one person who needs more than one
> > keyboard focus, for instance. This is one display server with
> > several wl_seats.
> >
> > wl_seat is not really a seat in the physical sense. It may be better
> > thought via input devices and input foci. One wl_seat has zero or
> > one wl_keyboard, wl_pointer, etc. If you have several physical
> > keyboards, they all act as one. To have two keyboards independent,
> > they are assigned to different wl_seats. Then each keyboard can be
> > typing into a different window at the same time.
> >
> > F

Re: Input and games.

2013-05-02 Thread Pekka Paalanen
On Tue, 30 Apr 2013 10:30:33 -0500
Jason Ekstrand  wrote:

> On Tue, Apr 30, 2013 at 10:25 AM, Todd Showalter
> wrote:
> 
> > On Tue, Apr 30, 2013 at 9:26 AM, Pekka Paalanen
> >  wrote:
> >
> > > unfortunately that is not how Wayland works at all. All clients
> > > are isolated from the start, regardless how they are spawned. The
> > > idea might be ok, but concepts and protocol design will be very
> > > different.
> >
> > I had a feeling that might be the case.
> >
> > > Doesn't really help that I've never used Steam, nor know what Big
> > > Picture is. But I do have a PS3 at home! :-)
> >
> > "Big Picture Mode" (let's call it BP for the purposes of this
> > description) in Steam is a full-screen program that shows things
> > like your game list, and is basically a glorified game launcher
> > with some ancillary functionality like updating and installing
> > games, and store access.  The important thing for this discussion
> > is that if you are playing a game you launched from BP, pressing
> > the home button backgrounds the game and foregrounds BP.  That is,
> > you return to BP rather than the desktop.
> >
> > The process is similar to what happens with the home button on
> > the PS3; the game drops to the background, and the OS puts up an
> > overlay. Press the home button again to dismiss the overlay and
> > return to the game.
> >
> > I'm not going to argue that this is essential behavior; I don't
> > think it is.  It may be desirable in some cases, but it might also
> > be desirable to treat the home button totally separately, maybe
> > have it bring up a gamepad config screen or something if that makes
> > sense. The main place it's desirable is in the case of living room
> > PCs, where people will tend to want to be running things like games
> > and movie players fullscreen, so a standard "get me back to the OS"
> > button is a useful abstraction.
> >
> 
> I think the best way to do it is to simply treat it like an available
> hotkey.  If the user wants to configure the home button to do
> something special at the compositor level they can do so.  This may
> include bringing up the window switcher as pq said, going to desktop,
> going to the media center, etc.  Otherwise, it gets passed to the
> client as a regular button event.
> 
> As far as steam and BigPicture goes, I think they run some sort of an
> overlay anyway so that their in-game chat etc. works.  Whatever they
> use to handle that could also handle the home button.  I'm not sure
> what they'll use to do that in the wayland world.  They may have some
> sort of embedded compositor or just a client-side library that all
> their games include. Whatever way they do it, they can handle the
> home button through that.

Yes, I agree.

Even if BP was not a nesting compositor, making the home button
minimize the active window would usually get you to the BP right under
it. The task switcher would be more reliable, though, and also allow to
get back to the game. It is all mostly a question of making the
Wayland server or the DE controllable with a gamepad.

In summary, I don't think we need to treat the home button specially in
the protocol.


Thanks,
pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-02 Thread Todd Showalter
On Thu, May 2, 2013 at 5:44 AM, Pekka Paalanen  wrote:
> On Tue, 30 Apr 2013 09:14:48 -0400
> Todd Showalter  wrote:
>
>> I'm getting set up to write code.  Someone kindly gave me a bash
>> script to pull down all the components, so once I get things set up
>> properly I'll see if I can get a patch together.
>
> Excellent!

The day job is interfering a bit, but I'm hoping to be able to
start working on this shortly.

>> The question is, is a gamepad an object, or is a *set* of gamepads
>> an object?
>
> Both, just like a wl_pointer can be one or more physical mice. Clients
> have no way to know whether a wl_pointer is backed by several mice, nor
> to separate events by physical device.
>
> The interfaces are abstract in that sense.

Right.  From a game point of view, we don't want to do the
conflated-device thing; it makes some sense to have two mice
controlling a single pointer on a single device (the thinkpad nub
mouse + usb mouse case), but it never makes sense to have multiple
gamepads generating events for a single virtual gamepad.  The game
needs to be able to tell them apart.

>> I'd rather the display server sorted it out, honestly, I just
>> wasn't sure how much policy people were comfortable with pushing into
>> the display server.
>
> I think we can put lots of policy in the server. A Wayland server is not
> just a generic display server like X, but is actually tied to the GUI
> paradigms, shell, and the desktop environment. In principle, every DE
> will have its own server, and code re-use is punted as an
> implementation detail. We prefer to communicate intent (set_fullscreen)
> rather than primitive actions (set window size && position it to 0,0 &&
> raise).

Ok, good.

> For example, a window manager with all its policies is just a component
> inside a Wayland server. It's also intended to be user configurable,
> like a modern DE.

Fair enough.  So, I'll need to fork Weston if I want to build my
fever dream combo of Quicksilver and Sawfish, then.  :)

>> Ok, that makes sense.  So, from the game point of view, if each
>> gamepad lives in its own wl_seat, how does the game detect that new
>> gamepads have arrived or gone away?  I assume there are wl_seat
>> create/destroy events?
>
> wl_seats are global objects in the protocol, and yes, we have events for
> globals to come and go dynamically. The events are in the wl_registry
> interface.

Ok, so in principle the game just watches for wl_seats appearing
and disappearing, and checks to see if they have gamepads attached.
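
In code I'd expect that to look roughly like the sketch below, written
against libwayland-client; the game_* hooks are hypothetical stand-ins
for whatever the engine does with seats:

#include <string.h>
#include <wayland-client.h>

/* Hypothetical game-side hooks. */
void game_add_seat(void *game, uint32_t name, struct wl_seat *seat);
void game_remove_seat(void *game, uint32_t name);

static void
registry_global(void *data, struct wl_registry *registry,
                uint32_t name, const char *interface, uint32_t version)
{
    if (strcmp(interface, "wl_seat") == 0) {
        /* New seat announced: bind it, then watch its capabilities
         * to learn whether a gamepad is attached. */
        struct wl_seat *seat =
            wl_registry_bind(registry, name, &wl_seat_interface, 1);
        game_add_seat(data, name, seat);
    }
}

static void
registry_global_remove(void *data, struct wl_registry *registry,
                       uint32_t name)
{
    /* A global went away; if it was one of our seats, retire the
     * player it was backing. */
    game_remove_seat(data, name);
}

static const struct wl_registry_listener registry_listener = {
    registry_global,
    registry_global_remove,
};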

> If just a gamepad goes away and later comes back, the wl_seat could
> even stay around in between. There can also be seats without a gamepad,
> so it is still the game's responsibility to decide which wl_seats it
> takes as players.

This is the icky problem for whoever handles it.  If a gamepad
disappears and then appears again attached to a different USB port, or
if a gamepad disappears and a different pad appears at the port where
the old one was, is it the same wl_seat?

> Which reminds me: maybe we should add a name string event to wl_seat
> interface? This way a game, if need be, can list the seats by name
> given by the user, and the user can then pick which ones are actual
> players. (It is a standard procedure to send initial state of an object
> right after binding/creating it.) I imagine it might be useful for other
> apps, too.
>
> Unless it's enough to just pick the wl_seats that have a gamepad?
>
> Hmm, is this actually any better than just handing all gamepads
> individually without any wl_seats, and letting the game sort them out?
> How far can we assume that a wl_seat == a player, for *every*
> existing wl_seat? And which player is which wl_seat?

That's why I was assuming originally that gamepads would all be
attached to a single wl_seat and come in with pad_index values.
However it winds up getting wrapped in protocol, what the game is
interested in (if it cares about more than one gamepad, which it may
not) is figuring out when those gamepads appear and disappear, how
they map to players, and what input each player is generating.
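
For comparison, the single-seat shape I originally had in mind would
look something like this on the client side; entirely hypothetical and
not lifted from any actual proposal, but it mirrors how wl_pointer
listeners look today:

#include <wayland-client.h>

struct wl_gamepad;   /* hypothetical interface */

/* Every event carries a pad_index instead of coming from a distinct
 * per-pad object; connect/disconnect double as hotplug notification. */
struct wl_gamepad_listener {
    void (*connect)(void *data, struct wl_gamepad *gamepad,
                    uint32_t pad_index);
    void (*disconnect)(void *data, struct wl_gamepad *gamepad,
                       uint32_t pad_index);
    void (*button)(void *data, struct wl_gamepad *gamepad,
                   uint32_t pad_index, uint32_t time,
                   uint32_t button, uint32_t state);
    void (*axis)(void *data, struct wl_gamepad *gamepad,
                 uint32_t pad_index, uint32_t time,
                 uint32_t axis, wl_fixed_t value);
};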

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-02 Thread Rick Yorgason
Pekka Paalanen  writes:
> Yes, I agree.
> 
> Even if BP was not a nesting compositor, making the home button
> minimize the active window would usually get you to the BP right under
> it. The task switcher would be more reliable, though, and also allow to
> get back to the game. It is all mostly a question of making the
> Wayland server or the DE controllable with a gamepad.
> 
> In summary, I don't think we need to treat the home button specially in
> the protocol.

I've been looking into how the Steam overlay works on Linux, and it seems to
inject code into the target app using LD_PRELOAD. This is how it draws its
interface over the game and how it intercepts the Shift+Tab shortcut that
opens the overlay. I see no reason why this technique wouldn't also work
for the home button, so I rescind the idea that making it a global message
might be useful.

-Rick-

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-02 Thread Daniel Stone
Hi,

On 2 May 2013 10:44, Pekka Paalanen  wrote:
> On Tue, 30 Apr 2013 09:14:48 -0400
> Todd Showalter  wrote:
>> The question is, is a gamepad an object, or is a *set* of gamepads
>> an object?
>
> Both, just like a wl_pointer can be one or more physical mice. Clients
> have no way to know whether a wl_pointer is backed by several mice, nor
> to separate events by physical device.
>
> The interfaces are abstract in that sense.

There's one crucial difference though, and one that's going to come up
when we address graphics tablets / digitisers too.  wl_pointer works
as a single interface because no matter how many mice are present, you
can aggregate them together and come up with a sensible result: they
all move the sprite to one location.  wl_touch fudges around this by
essentially asserting that not only will you generally only have one
direct touchscreen, but it provides for multiple touches, so you can
treat one touch on each of multiple screens as multiple touches on a
single screen.

The gamepad interaction doesn't have this luxury, and neither do
tablets.  I don't think splitting them out to separate seats is the
right idea though: what if (incoming stupid hypothetical alert) you
had four people on a single system, each with their own keyboards and
gamepads.  Kind of like consoles are today, really.  Ideally, you'd
want an association between the keyboards and gamepads, which would be
impossible if every gamepad had one separate wl_seat whose sole job
was to nest it.

I think it'd be better to, instead of wl_seat::get_gamepad returning a
single new_id wl_gamepad, as wl_pointer/etc do it today, have
wl_seat::get_gamepads, which would send one wl_seat::gamepad event
with a new_id wl_gamepad, for every gamepad which was there or
subsequently added.  That way we keep the seat association, but can
still deal with every gamepad individually.
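
In client terms the shape would be roughly the following; a
hypothetical sketch, not spec text, since neither the event nor
wl_gamepad is specced yet:

#include <wayland-client.h>

struct wl_gamepad;   /* hypothetical interface */

/* Hypothetical game-side hook. */
void player_bind_gamepad(void *data, struct wl_seat *seat,
                         struct wl_gamepad *gamepad);

/* One wl_seat::gamepad event per pad, for pads present at
 * get_gamepads time and for every later hotplug. The seat
 * association is kept, but each pad stays its own object. */
static void
seat_handle_gamepad(void *data, struct wl_seat *seat,
                    struct wl_gamepad *gamepad)
{
    player_bind_gamepad(data, seat, gamepad);
}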

Cheers,
Daniel
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-02 Thread Pekka Paalanen
On Thu, 2 May 2013 18:18:27 + (UTC)
Rick Yorgason  wrote:

> Pekka Paalanen  writes:
> > Yes, I agree.
> > 
> > Even if BP was not a nesting compositor, making the home button
> > minimize the active window would usually get you to the BP right
> > under it. The task switcher would be more reliable, though, and
> > also allow to get back to the game. It is all mostly a question of
> > making the Wayland server or the DE controllable with a gamepad.
> > 
> > In summary, I don't think we need to treat the home button
> > specially in the protocol.
> 
> I've been looking into how the Steam overlay works on Linux, and it
> seems to inject code into the target app using LD_PRELOAD. This is
> how it draws its interface over the game, and how it intercepts the
> Shift+Tab shortcut to open the overlay,

Uh oh, yuk...

I wonder if one would have serious trouble achieving the same on
Wayland. X is so much more liberal on what one can do wrt. protocol and
the C API. For instance, in X I believe one can query a lot of stuff
from the server, in Wayland nothing. In X a window reference is just an
integer, and if you get something wrong, I think you get an error that
you can choose to handle non-fatally. In Wayland, you have a pointer,
which means you are susceptible to use-after-free and segfaults, and if
you do something wrong, the server disconnects the whole client on the
spot.

> and I see no reason why this
> technique wouldn't also work for the home button, so I rescind the
> idea that making it a global message might be useful.

Whether this is just ugly or nearly impossible depends very much on the
API the LD_PRELOAD lib is intercepting. I'm hugely sceptical.


Cheers,
pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Pekka Paalanen
On Thu, 2 May 2013 19:28:41 +0100
Daniel Stone  wrote:

> Hi,
> 
> On 2 May 2013 10:44, Pekka Paalanen  wrote:
> > On Tue, 30 Apr 2013 09:14:48 -0400
> > Todd Showalter  wrote:
> >> The question is, is a gamepad an object, or is a *set* of
> >> gamepads an object?
> >
> > Both, just like a wl_pointer can be one or more physical mice.
> > Clients have no way to know whether a wl_pointer is backed by
> > several mice, nor to separate events by physical device.
> >
> > The interfaces are abstract in that sense.
> 
> There's one crucial difference though, and one that's going to come up
> when we address graphics tablets / digitisers too.  wl_pointer works
> as a single interface because no matter how many mice are present, you
> can aggregate them together and come up with a sensible result: they
> all move the sprite to one location.  wl_touch fudges around this by
> essentially asserting that not only will you generally only have one
> direct touchscreen, but it provides for multiple touches, so you can
> treat one touch on each of multiple screens as multiple touches on a
> single screen.

Right. Could we just say that each such non-aggregatable device must be
put into a wl_seat that does not already have such a device?
Or make that an implementors' guideline rather than a hard requirement
in the protocol spec.

> The gamepad interaction doesn't have this luxury, and neither do
> tablets.  I don't think splitting them out to separate seats is the
> right idea though: what if (incoming stupid hypothetical alert) you
> had four people on a single system, each with their own keyboards and
> gamepads.  Kind of like consoles are today, really.  Ideally, you'd
> want an association between the keyboards and gamepads, which would be
> impossible if every gamepad had one separate wl_seat whose sole job
> was to nest it.

So... what's wrong with putting each keyboard into the wl_seat where it
belongs, along with the gamepad?

> I think it'd be better to, instead of wl_seat::get_gamepad returning a
> single new_id wl_gamepad, as wl_pointer/etc do it today, have
> wl_seat::get_gamepads, which would send one wl_seat::gamepad event
> with a new_id wl_gamepad, for every gamepad which was there or
> subsequently added.  That way we keep the seat association, but can
> still deal with every gamepad individually.

It would be left for the client to decide which gamepad it wants from
which wl_seat, right?

Do we want to force all clients to choose every non-aggregatable device
this way?

Essentially, that would mean that wl_seat are just for the traditional
keyboard & mouse (and touchscreen so far) association, and then
everything else would be left for each client to assign to different
wl_seats on their own. This seems strange. Why do we need a wl_seat
then, why not do the same with keyboards and mice?

Oh right, focus. You want to be able to control keyboard focus with a
pointer. Why is a gamepad focus different? Would all gamepads follow
the keyboard focus? If there are several wl_seats with kbd & ptr, which
keyboard focus do they follow? What if the same gamepad is left active
in more than one wl_seat? What if there is no keyboard or pointer, e.g.
you had only a touchscreen and two gamepads (say, IVI)?

And then replace every "gamepad" with "digitizer", and all other
non-aggregatable input devices, and also all raw input devices via
evdev fd passing. The fd passing I believe has similar problems: who
gets the events, which wl_seat do they follow.

This is a new situation, and so many open questions... I just continued
on the existing pattern.


Cheers,
pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Pekka Paalanen
On Thu, 2 May 2013 10:46:56 -0400
Todd Showalter  wrote:

> On Thu, May 2, 2013 at 5:44 AM, Pekka Paalanen 
> wrote:
> > On Tue, 30 Apr 2013 09:14:48 -0400
> > Todd Showalter  wrote:
> >
...
> >> The question is, is a gamepad an object, or is a *set* of
> >> gamepads an object?
> >
> > Both, just like a wl_pointer can be one or more physical mice.
> > Clients have no way to know whether a wl_pointer is backed by
> > several mice, nor to separate events by physical device.
> >
> > The interfaces are abstract in that sense.
> 
> Right.  From a game point of view, we don't want to do the
> conflated-device thing; it makes some sense to have two mice
> controlling a single pointer on a single device (the thinkpad nub
> mouse + usb mouse case), but it never makes sense to have multiple
> gamepads generating events for a single virtual gamepad.  The game
> needs to be able to tell them apart.

Indeed, that's why I proposed to put them in separate wl_seats. It
doesn't make a difference on the protocol level.

...
> > If just a gamepad goes away and later comes back, the wl_seat could
> > even stay around in between. There can also be seats without a
> > gamepad, so it is still the game's responsibility to decide which
> > wl_seats it takes as players.
> 
> This is the icky problem for whoever handles it.  If a gamepad
> disappears and then appears again attached to a different usb port, or
> if a gamepad disappears and a different pad appears at the port where
> the old one was, is it the same wl_seat?

Yup. Whatever we do, we get it wrong for someone, so there needs to be
a GUI to fix it. But should that GUI be every game's burden, or the
server's burden...

Along with the GUI is the burden of implementing the default
heuristics, which may require platform specific information.

> > Which reminds me: maybe we should add a name string event to wl_seat
> > interface? This way a game, if need be, can list the seats by name
> > given by the user, and the user can then pick which ones are actual
> > players. (It is a standard procedure to send initial state of an
> > object right after binding/creating it.) I imagine it might be
> > useful for other apps, too.
> >
> > Unless it's enough to just pick the wl_seats that have a gamepad?
> >
> > Hmm, is this actually any better than just handing all gamepads
> > individually without any wl_seats, and letting the game sort them
> > out? How far can we assume that a wl_seat == a player, for *every*
> > existing wl_seat? And which player is which wl_seat?
> 
> That's why I was assuming originally that gamepads would all be
> attached to a single wl_seat and come in with pad_index values.
> However it winds up getting wrapped in protocol, what the game is
> interested in (if it cares about more than one gamepad, which it may
> not) is figuring out when those gamepads appear and disappear, how
> they map to players, and what input each player is generating.

Right.

I can summarize my question as this:

Which one is better for the end user: have the device-to-seat assignment
heuristics and GUI in the server, and the seat-to-player mapping GUI
in every game; or have it all in every game?

Or can or should we design the protocol to allow both ways? Even if
gamepads are in different wl_seats, the game is still free to mix and
match.

I have come to a point where I can only ask more questions, without
good suggestions.


Thanks,
pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Todd Showalter
On Fri, May 3, 2013 at 3:34 AM, Pekka Paalanen  wrote:

> Yup. Whatever we do, we get it wrong for someone, so there needs to be
> a GUI to fix it. But should that GUI be all games' burden, or servers'
> burden...
>
> Along with the GUI is the burden of implementing the default
> heuristics, which may require platform specific information.

I don't know that you need a GUI to fix it as long as you're
willing to lay down some policy.  We could go with basic heuristics:

- if a gamepad unplugs from a specific USB port and some other gamepad
re-plugs in the same port before any other gamepads appear, it's the
same player

- if a gamepad unplugs from a specific USB port and then appears in
another before any other gamepads appear, it's the same player

- otherwise, you get whatever mad order falls out of the code

I think that covers the common case; if people start swapping
multiple controllers around between ports, they might have to re-jack
things to get the gamepad->player mapping they like, but that's going
to be rare.
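
To make those rules concrete, the server-side bookkeeping could be as
small as the sketch below; the type, the slot array and the port
string are all hypothetical, standing in for whatever the server
tracks per player:

#include <stdbool.h>
#include <string.h>

struct player_slot {
    bool vacant;          /* pad unplugged, slot waiting for a return */
    char last_port[64];   /* e.g. sysfs path of the USB port */
};

/* Reattach a newly plugged pad to a player: an exact port match wins
 * outright (rule 1); otherwise the first vacant slot is taken to mean
 * the pad merely moved ports (rule 2). NULL means no vacant slot, so
 * the caller allocates a fresh player (rule 3). */
static struct player_slot *
reclaim_slot(struct player_slot *slots, int n_slots, const char *port)
{
    struct player_slot *moved = NULL;

    for (int i = 0; i < n_slots; i++) {
        if (!slots[i].vacant)
            continue;
        if (strcmp(slots[i].last_port, port) == 0)
            return &slots[i];
        if (!moved)
            moved = &slots[i];
    }
    return moved;
}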

> I can summarize my question to this:
>
> Which one is better for the end user: have the device-to-seat assignment
> heuristics and GUI in the server, and the seat-to-player mapping GUI
> in every game; or have it all in every game?

Heuristics mean less work for the player and behaviour the player
can learn to anticipate.  I say go with that.  I think the moment you
present people with a GUI plugboard and ask them to patch-cable
controllers to player IDs, you're in a bad place.

I could see it being an advanced option that a savvy player could
bring up to fix things without rejacking the hardware, but the less
technically savvy are going to have a far easier time just physically
unplugging and replugging gamepads than they are figuring out a GUI
they've never (or rarely) seen before.

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Pekka Paalanen
On Fri, 3 May 2013 03:51:33 -0400
Todd Showalter  wrote:

> On Fri, May 3, 2013 at 3:34 AM, Pekka Paalanen 
> wrote:
> 
> > Yup. Whatever we do, we get it wrong for someone, so there needs to
> > be a GUI to fix it. But should that GUI be all games' burden, or
> > servers' burden...
> >
> > Along with the GUI is the burden of implementing the default
> > heuristics, which may require platform specific information.
> 
> I don't know that you need a GUI to fix it as long as you're
> willing to lay down some policy.  We could go with basic heuristics:
> 
> - if a gamepad unplugs from a specific usb port and some other gamepad
> re-plugs in the same port before any other gamepads appear, it's the
> same player
> 
> - if a gamepad unplugs from a specific usb port and then appears in
> another before any other gamepads appear, it's the same player
> 
> - otherwise, you get whatever mad order falls out of the code
> 
> I think that covers the common case; if people start swapping
> multiple controllers around between ports, they might have to re-jack
> things to get the gamepad->player mapping they like, but that's going
> to be rare.

Sure, the heuristics can cover a lot, but there is still the mad case,
and also the initial setup (system started with 3 new gamepads hooked
up), where one may want to configure manually. The GUI is just my
reminder, that sometimes it is necessary to configure manually, and
there must be some way to do it when wanted.

Even if it's just "press the home button on one gamepad at a time, to
assign players 1 to N."

> > I can summarize my question to this:
> >
> > Which one is better for the end user: have the device-to-seat
> > assignment heuristics and GUI in the server, and the seat-to-player
> > mapping GUI in every game; or have it all in every game?
> 
> Heuristics mean less work for the player and behaviour the player
> can learn to anticipate.  I say go with that.  I think the moment you
> present people with a gui plugboard and ask them to patch-cable
> controllers to player IDs, you're in a bad place.
> 
> I could see it being an advanced option that a savvy player could
> bring up to fix things without rejacking the hardware, but the less
> technically savvy are going to have a far easier time just physically
> unplugging and replugging gamepads than they are figuring out a GUI
> they've never (or rarely) seen before.

Well, yes. But the question was not whether we should have heuristics
or a GUI. The question is, do we want the heuristics *and* the GUI in
the server or the games? The GUI is a fallback, indeed, for those who
want it, and so is the wl_seat-to-player mapping setup in a game.

If we do the heuristics in the server, there is very little we have to
do in the protocol for it. Maybe just allow to have human-readable
names for wl_seats. The "press home button to assign players" would be
easy to implement. The drawback is that the server's player 1 might not
be the game's player 1, so we need some thought to make them match.

If we do the heuristics in the games, we have to think about what
meta data of the gamepads we need to transmit. You said something about
a hash of some things before. If we have just a single hash, we cannot
implement the heuristics you described above, so it will need some
thought. Also, if we want to drive things like player ID lights in
gamepads, that needs to be considered in the protocol.

Maybe there could be some scheme where we would not need to have the
wl_seat<->player mapping configurable in games after all, if one goes
with server side heuristics. There are also the things Daniel wrote
about, which link directly to what we can do.


Thanks,
pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Todd Showalter
On Fri, May 3, 2013 at 6:42 AM, Pekka Paalanen  wrote:

> Sure, the heuristics can cover a lot, but there is still the mad case,
> and also the initial setup (system started with 3 new gamepads hooked
> up), where one may want to configure manually. The GUI is just my
> reminder, that sometimes it is necessary to configure manually, and
> there must be some way to do it when wanted.
>
> Even if it's just "press the home button on one gamepad at a time, to
> assign players 1 to N."

If there's going to be a gamepad setup gui, my preference would be
for it to be a system thing rather than a game thing.  Partly because
I'm lazy/cheap and don't want to have to do themed versions of it for
every game I do, but also partly because otherwise it's something else
that someone can half-ass or get wrong.

> Well, yes. But the question was not whether we should have heuristics
> or a GUI. The question is, do we want the heuristics *and* the GUI in
> the server or the games? The GUI is a fallback, indeed, for those who
> want it, and so is also the wl_seat-player mapping setup in a game.
>
> If we do the heuristics in the server, there is very little we have to
> do in the protocol for it. Maybe just allow to have human-readable
> names for wl_seats. The "press home button to assign players" would be
> easy to implement. The drawback is that the server's player 1 might not
> be the game's player 1, so we need some thought to make them match.
>
> If we do the heuristics in the games, we have to think about what
> meta data of the gamepads we need to transmit. You said something about
> a hash of some things before. If we have just a single hash, we cannot
> implement the heuristics you described above, so it will need some
> thought. Also, if we want to drive things like player id lights in
> gamepads, that needs to be considered in the protocol.
>
> Maybe there could be some scheme, where we would not need to have the
> wl_seat<->player mapping configurable in games after all, if one goes
> with server side heuristics. There are also the things Daniel wrote
> about, which link directly to what we can do.

I vote do it on the server, however it winds up being done.  It
means the client is isolated from a whole bunch of things it would
otherwise need to explicitly support, and it means that things happen
consistently between games.  It also means that any bugs in the
process will be addressable without shipping a new build of the game.

 Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Pekka Paalanen
On Fri, 3 May 2013 09:12:20 -0400
Todd Showalter  wrote:

> On Fri, May 3, 2013 at 6:42 AM, Pekka Paalanen  wrote:
> 
> > Sure, the heuristics can cover a lot, but there is still the mad case,
> > and also the initial setup (system started with 3 new gamepads hooked
> > up), where one may want to configure manually. The GUI is just my
> > reminder, that sometimes it is necessary to configure manually, and
> > there must be some way to do it when wanted.
> >
> > Even if it's just "press the home button on one gamepad at a time, to
> > assign players 1 to N."
> 
> If there's going to be a gamepad setup gui, my preference would be
> for it to be a system thing rather than a game thing.  Partly because
> I'm lazy/cheap and don't want to have to do themed versions of it for
> every game I do, but also partly because otherwise it's something else
> that someone can half-ass or get wrong.
> 
> > Well, yes. But the question was not whether we should have heuristics
> > or a GUI. The question is, do we want the heuristics *and* the GUI in
> > the server or the games? The GUI is a fallback, indeed, for those who
> > want it, and so is also the wl_seat-player mapping setup in a game.
> >
> > If we do the heuristics in the server, there is very little we have to
> > do in the protocol for it. Maybe just allow to have human-readable
> > names for wl_seats. The "press home button to assign players" would be
> > easy to implement. The drawback is that the server's player 1 might not
> > be the game's player 1, so we need some thought to make them match.
> >
> > If we do the heuristics in the games, we have to think about what
> > meta data of the gamepads we need to transmit. You said something about
> > a hash of some things before. If we have just a single hash, we cannot
> > implement the heuristics you described above, so it will need some
> > thought. Also, if we want to drive things like player id lights in
> > gamepads, that needs to be considered in the protocol.
> >
> > Maybe there could be some scheme, where we would not need to have the
> > wl_seat<->player mapping configurable in games after all, if one goes
> > with server side heuristics. There are also the things Daniel wrote
> > about, which link directly to what we can do.
> 
> I vote do it on the server, however it winds up being done.  It
> means the client is isolated from a whole bunch of things it would
> otherwise need to explicitly support, and it means that things happen
> consistently between games.  It also means that any bugs in the
> process will be addressable without shipping a new build of the game.

Cool, I agree with that. :-)


Thanks,
pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Daniel Stone
Hi,

On 20 April 2013 22:13, Nick Kisialiou  wrote:
> Generic device input may be too complicated to put into the Wayland protocol.
> For example, take Razer Hydra controller:
> http://www.engadget.com/2011/06/08/razer-totes-hydra-sticks-and-6400dpi-dual-sensor-mice-to-e3-2011/
>
> There are 2 USB-connected controllers, one for each hand, each with 6 DOF
> of information: 3D position and 3D rotation. I programmed it for
> a 3D environment rather than games. Each controller sends you a quaternion
> to extract the data. On top of it, the output is noisy, so you'd want to add
> filters to integrate the noise out.
>
> The last thing I'd want is to have a middleman between the USB port and my
> processing code that messes around with rotation matrices and introduces
> delays. I think it is reasonable to limit the protocol to mouse-like devices
> only. As long as the protocol allows 2 mice simultaneously in the system
> (which it does), IMHO, the rest of the processing is better placed within
> your own code.

I think with 6DoF-type devices, we really shouldn't try to do anything
clever with them, and pretty much just pass evdev input through.  The
only reason we created wl_pointer and wl_keyboard as they are is that
the compositor needs to interpret and intercept them, and clients
would all be doing more or less the same interpretation too.  For
complex devices where it's of no benefit to have the compositor
rewrite the events, I think we just shouldn't even try.
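
For such devices the client side would then be nothing more than the
usual evdev read loop on whatever descriptor it is handed; a sketch,
with the handle_* callbacks standing in for application code:

#include <linux/input.h>
#include <unistd.h>

/* Hypothetical application callbacks. */
void handle_axis(unsigned int code, int value);
void handle_button(unsigned int code, int value);

/* Drain raw evdev events from an fd the compositor passed over;
 * assumes the fd was set non-blocking. */
static void
drain_evdev(int fd)
{
    struct input_event ev;

    while (read(fd, &ev, sizeof ev) == sizeof ev) {
        if (ev.type == EV_ABS)
            handle_axis(ev.code, ev.value);    /* position/rotation axes */
        else if (ev.type == EV_KEY)
            handle_button(ev.code, ev.value);  /* 1 = press, 0 = release */
    }
}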

If the gamepad proposal was any more complex than it is now, I'd lean
towards just shuttling the raw data to clients rather than having our
own protocol.  But the proposal I've seen is pretty nice and it
definitely helps our gaming story (which is really quite poor now), so
that helps.

The one thing I think it's missing so far is physical controller gyro
measurements, e.g. for new PS3/PS4 controllers and the Wiimote.

Cheers,
Daniel

> On Sat, Apr 20, 2013 at 9:38 AM, Todd Showalter 
> wrote:
>>
>> On Sat, Apr 20, 2013 at 12:20 PM, Daniel  wrote:
>>
>> > This is useful for desktop software too. I'm thinking of Stellarium or
>> > Google Earth, where moving the mouse is expected to move the
>> > environment, not the pointer itself.
>>
>> "Games" is really perhaps shorthand here; there are a lot of tools
>> and so forth that have similar behavior and operating requirements to
>> games, but aren't strictly games per se.  If you have an architectural
>> walkthrough program that lets you navigate a building and make
>> alterations, that's not really something you'd call a game, but it is
>> operating under many of the same constraints.  It's more obvious in
>> things using 3D, but even the 2D side can use it in places.
>>
>> I could easily see (for example) wanting to be able to do drag &
>> drop within a window on a canvas larger than the window can display;
>> say it's something like dia or visio or the like.  I drag an icon from
>> the sidebar into the canvas, and if it gets to the edge of the canvas
>> window the canvas scrolls and the dragged object (and the pointer)
>> parks at the window edge.
>>
>> It's useful behavior.  I can definitely see why adding it to the
>> protocol makes things more annoying, but I've a strong suspicion it's
>> one of those things that if you leave it out you'll find that down the
>> road there's a lot of pressure to find a way to hack it in.
>>
>> Todd.
>>
>> --
>>  Todd Showalter, President,
>>  Electron Jump Games, Inc.
>> ___
>> wayland-devel mailing list
>> wayland-devel@lists.freedesktop.org
>> http://lists.freedesktop.org/mailman/listinfo/wayland-devel
>
>
>
> ___
> wayland-devel mailing list
> wayland-devel@lists.freedesktop.org
> http://lists.freedesktop.org/mailman/listinfo/wayland-devel
>
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Daniel Stone
Hi,

On 19 April 2013 10:18, Pekka Paalanen  wrote:
> Keyboards already have extensive mapping capabilities. A Wayland server
> sends keycodes (I forget in which space exactly) and a keymap, and
> clients feed the keymap and keycodes into libxkbcommon, which
> translates them into something actually useful. Maybe something similar
> could be invented for game controllers? But yes, this is off-topic for
> Wayland, apart from the protocol of what event codes and other data to
> pass.

It's worth noting that the only reason libxkbcommon exists is because
there's just no way to express it generically.  People want to have
Cyrillic and US keymaps active where Ctrl + W triggers 'close window'
regardless of which keymap's active.  But if they have Cyrillic, US
and Icelandic Dvorak active, they want Ctrl + W to trigger for Ctrl +
(wherever W is in Icelandic Dvorak) when in Icelandic Dvorak, and Ctrl
+ W when it's in Cyrillic and US.  And so on, and so forth.

If it was possible to just use wl_text all the way, I would never have
written that bastard library.  But you can't win them all.
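
For the record, the simple single-layout case is tiny on the client
side once the server's keymap has been fed into xkbcommon; a sketch,
assuming an xkb_state kept current from wl_keyboard keymap and
modifiers events:

#include <stdbool.h>
#include <xkbcommon/xkbcommon.h>

/* Did this wl_keyboard::key event complete Ctrl+W in the active
 * layout? The +8 converts evdev keycodes to the X-style keycodes
 * xkbcommon expects. */
static bool
is_ctrl_w(struct xkb_state *state, uint32_t evdev_keycode)
{
    xkb_keysym_t sym = xkb_state_key_get_one_sym(state, evdev_keycode + 8);

    return sym == XKB_KEY_w &&
           xkb_state_mod_name_is_active(state, XKB_MOD_NAME_CTRL,
                                        XKB_STATE_MODS_EFFECTIVE) > 0;
}

The hard part Daniel describes is everything around this: deciding
which layouts a shortcut should match in when several are active.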

Cheers,
Daniel

>> Event Driven vs. Polling
>>
>> Modern gui applications tend to be event-driven, which makes
>> sense; most modern desktop applications spend most of their time doing
>> nothing and waiting for the user to generate input.  Games are
>> different, in that they tend to be simulation-based, and things are
>> happening regardless of whether the player is providing input.
>>
>> In most games, you have to poll input between simulation ticks.
>> If you accept and process an input event in the middle of a simulation
>> tick, your simulation will likely be internally inconsistent.  Input
>> in games typically moves or changes in-game objects, and if input
>> affects an object mid-update, part of the simulation tick will have
>> been calculated based on the old state of the object, and the rest
>> will be based on the new state.
>>
>> To deal with this on event-driven systems, games must either
>> directly poll the input system, or else accumulate events and process
>> them between simulation ticks.  Either works, but being able to poll
>> means the game needs to do less work.
>
> The Wayland protocol is event driven. Polling does not make sense, since it
> would mean a synchronous round-trip to the server, which for something
> like this is just far too expensive, and easily (IMHO) worked around.
>
> So, you have to maintain input state yourself, or by a library you use.
> It could even be off-loaded to another thread.
>
> There is also a huge advantage over polling: in an event driven design,
> it is impossible to miss very fast, transient actions, which polling
> would never notice. And whether you need to know if such a transient
> happened, or how many times it happened, or how long each
> transient took between two game ticks, is all up to you and available.
>
> I once heard about some hardcore gamer complaining, that in some
> systems or under some conditions, probably related to the
> ridiculous framerates gamers usually demand, the button sequence he hits
> in a fraction of a second is not registered properly, and I was
> wondering how is it possible for it to not register properly. Now I
> realised a possible cause: polling.
>
> Event driven is a little more work for the "simple" games, but it gives
> you guarantees. Would you not agree?
>
>> Input Sources & Use
>>
>> Sometimes games want desktop-style input (clicking buttons,
>> entering a name with the keyboard), but often games want to treat all
>> the available input data as either digital values (mouse buttons,
>> keyboard keys, gamepad buttons...), constrained-axis "analog" (gamepad
>> triggers, joysticks) or unconstrained axis "analog" (mouse/trackball).
>>  Touch input is a bit of a special case, since it's nearly without
>> context.
>
> Is this referring to the problem of "oops, my mouse left the Quake
> window when I tried to turn"? Or maybe more of "oops, the pointer hit
> the monitor edge and I cannot turn any more?" I.e. absolute vs.
> relative input events?
>
> There is a relative motion events proposal for mice:
> http://lists.freedesktop.org/archives/wayland-devel/2013-February/007635.html
>
> Clients cannot warp the pointer, so there is no way to hack around it.
> We need to explicitly support it.
>
>> Games usually care about all of:
>>
>> - the state of buttons/keys -- whether they are currently down or up
>> -- think WASD here
>> - edge detection of buttons/keys -- trigger, release and state change
>> - the value of each input axis -- joystick deflection, screen position
>> of the cursor, etc
>> - the delta of each input axis
>>
>> From what I've seen, SDL does not give us the button/key state
>> without building a layer on top of it; we only get edge detection.
>> Likewise, as far as I understand nothing does deltas.
>
> Ah yes, deltas are the relative motion events, see above.
>
>> Input Capture
>>
>> It would be

Re: Input and games.

2013-05-03 Thread Daniel Stone
Hi,

On 21 April 2013 06:28, Todd Showalter  wrote:
> On Fri, Apr 19, 2013 at 7:08 PM, Bill Spitzak  wrote:
>> I think this is going to require pointer warping. At first I thought it
> could be done by hiding the pointer and faking its position, but that would
>> not stop the invisible pointer from moving out of the window and becoming
>> visible, or moving into a hot-spot and triggering an unexpected effect.
>
> I think edge resistance/edge snapping really wants pointer warping as 
> well.

It's really difficult to achieve a nicely responsive and fluid UI
(i.e. doing this without jumps) when you're just warping the pointer.
To be honest, I'd prefer to see an interface where, upon a click, you
could set an acceleration (deceleration) factor which was valid for
the duration of that click/drag only.  We already have drag & drop
working kind of like this, so it's totally possible to do for relative
(i.e. wl_pointer) devices.  The only two usecases I've seen come up
for pointer warping are this and pointer confinement, which I'd rather
do specifically than through warping - which is a massive minefield I
really, really want to avoid.

But it's also a totally orthogonal discussion. :)

Cheers,
Daniel
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Daniel Stone
Hi,

On 29 April 2013 18:44, Bill Spitzak  wrote:
> Has anybody thought about pens (i.e. Wacom tablets)? These have 5 degrees of
> freedom (most cannot distinguish rotation about the long axis of the pen).
> There are also spaceballs with full 6 degrees of freedom.

As Todd said, these really need to be their own interface.  From a
purely abstract point of view, they kind of look the same, but if you
have one interface to represent everything, that interface ends up
looking a lot like XI2, which no-one uses because it's a monumental
pain in the behind.

The biggest blocker though, is that the compositor addresses gamepads
and tablets completely differently.  Tablets have particular focus,
and their co-ordinates need to be interpreted and scaled to
surface-local, whereas gamepads are going to have co-ordinates in a
totally different space, which is sometimes angular rather than
positional in a 2D space.

So, again, a tablet interface is a very useful thing to have - and
it's a discussion we absolutely need to have at some point - but it
has no bearing at all on this one.  A good starting point would be to
look at the X.Org Wacom driver and its capability set.

> Another idea was that buttons had the same api as analog controls, it's just
> that they only reported 0 or +1, never any fractions (and since it sounds
> like some controls have pressure-sensitive buttons this may make it easier
> to use the same code on different controls).

I think this falls into the same over-generalising trap, where you
look like a smart alec but don't produce anything like a usable
API.

Cheers,
Daniel
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Daniel Stone
Hi,

On 3 May 2013 08:17, Pekka Paalanen  wrote:
> On Thu, 2 May 2013 19:28:41 +0100
> Daniel Stone  wrote:
>> There's one crucial difference though, and one that's going to come up
>> when we address graphics tablets / digitisers too.  wl_pointer works
>> as a single interface because no matter how many mice are present, you
>> can aggregate them together and come up with a sensible result: they
>> all move the sprite to one location.  wl_touch fudges around this by
>> essentially asserting that not only will you generally only have one
>> direct touchscreen, but it provides for multiple touches, so you can
>> pretend one touch each on multiple screens are multiple touches on a
>> single screen.
>
> Right. Could we just say that each such non-aggregatable device must be
> put into a wl_seat that does not already have such a device?
> Or make that an implementors guideline rather than a hard requirement
> in the protocol spec.

*shrug*, it really depends.  If we're going to say that gamepads are
associated with focus, then they have to go in a _specific_ wl_seat:
the focus is per-seat, so if we're saying that clicking on a window
with your mouse to focus it (or Alt-Tabbing to it) also redirects
gamepad events there, then the gamepad needs to be part of _that_ seat
which changed the focus.  Remember that we can have multiple foci per
the protocol (though the UI for that gets very interesting very
quickly).

If they're unaffected by the focus - which they would be if they're
just going into random new wl_seats - then they shouldn't be in
wl_seat just because it's the container we have for input devices
right now, they should have their own interfaces.  Which really means
wl_gamepad_manager which, when bound to, advertises new_id
wl_gamepads.

tl;dr: wl_seat has a very specific meaning of a set of devices with
one focus, please don't abuse it.

>> The gamepad interaction doesn't have this luxury, and neither do
>> tablets.  I don't think splitting them out to separate seats is the
>> right idea though: what if (incoming stupid hypothetical alert) you
>> had four people on a single system, each with their own keyboards and
>> gamepads.  Kind of like consoles are today, really.  Ideally, you'd
>> want an association between the keyboards and gamepads, which would be
>> impossible if every gamepad had one separate wl_seat whose sole job
>> was to nest it.
>
> So... what's wrong in putting each keyboard into the wl_seat where it
> belongs, along with the gamepad?

In that case, yes, we would have wl_seats with one wl_keyboard and
multiple wl_gamepads.

>> I think it'd be better to, instead of wl_seat::get_gamepad returning a
>> single new_id wl_gamepad, as wl_pointer/etc do it today, have
>> wl_seat::get_gamepads, which would send one wl_seat::gamepad event
>> with a new_id wl_gamepad, for every gamepad which was there or
>> subsequently added.  That way we keep the seat association, but can
>> still deal with every gamepad individually.
>
> It would be left for the client to decide which gamepad it wants from
> which wl_seat, right?
>
> Do we want to force all clients to choose every non-aggregatable device
> this way?

No idea. :)

> Essentially, that would mean that wl_seat are just for the traditional
> keyboard & mouse (and touchscreen so far) association, and then
> everything else would be left for each client to assign to different
> wl_seats on their own. This seems strange. Why do we need a wl_seat
> then, why not do the same with keyboards and mice?
>
> Oh right, focus. You want to be able to control keyboard focus with a
> pointer. Why is a gamepad focus different? Would all gamepads follow
> the keyboard focus? If there are several wl_seats with kbd & ptr, which
> keyboard focus do they follow? What if the same gamepad is left active
> in more than one wl_seat? What if there is no keyboard or pointer, e.g.
> you had only a touchscreen and two gamepads (say, IVI)?
>
> And then replace every "gamepad" with "digitizer", and all other
> non-aggregatable input devices, and also all raw input devices via
> evdev fd passing. The fd passing I believe has similar problems: who
> gets the events, which wl_seat do they follow.
>
> This is a new situation, and so many open questions... I just continued
on the existing pattern.

Yeah, it really all depends on these questions.  But intuitively, I'd
say that gamepads should follow a seat's focus, which means expanding
wl_seat to be able to advertise multiple gamepads.  Even on touch, we
still have wl_touch as part of wl_seat, driving the focus.  And I
don't think a gamepad could ever be a part of multiple seats; perhaps
it could be shifted between seats if necessary, but this is a problem
we already have with keyboard, pointer and touch today.  And you don't
need to deal with that in the protocol: just have the compositor
destroy the device and create a new one in the new seat.

Cheers,
Daniel
___
wayland-devel mail

Re: Input and games.

2013-05-03 Thread Todd Showalter
On Fri, May 3, 2013 at 12:06 PM, Daniel Stone  wrote:

> I think with 6DoF-type devices, we really shouldn't try to do anything
> clever with them, and pretty much just pass evdev input through.  The
> only reason we created wl_pointer and wl_keyboard as they are is that
> the compositor needs to interpret and intercept them, and clients
> would all be doing more or less the same interpretation too.  For
> complex devices where it's of no benefit to have the compositor
> rewrite the events, I think we just shouldn't even try.
>
> If the gamepad proposal was any more complex than it is now, I'd lean
> towards just shuttling the raw data to clients rather than having our
> own protocol.  But the proposal I've seen is pretty nice and it
> definitely helps our gaming story (which is really quite poor now), so
> that helps.
>
> The one thing I think it's missing so far is physical controller gyro
> measurements, e.g. for new PS3/PS4 controllers and the Wiimote.

In my experience, the accelerometers on these devices are all over
the map in terms of precision and accuracy; if you have a MotionPlus
and a Nunchuck controller plugged into your Wiimote, IIRC you have
three different 3-axis accelerometers active, each with a different
resolution.  The weakest is in the nunchuck, the wiimote one is
better, and the motion plus one is better still.

Between manufacturers, I'm not even sure if the accelerometer
coordinate system is the same *handed*, let alone what the reference
orientation is.

In my experience, the 3vec coming out of the accelerometer is
pretty close to normal, at least in the cases I've seen.  Not always,
obviously; if the player is swinging the thing around, it may be
wildly out of normal, but if you put the thing on the table and leave
it, then (ignoring the wicked instability the accelerometers seem to
have) it gives you something approaching a unit vector.  Which means
(once again) that the 24.8 fixed format really isn't suitable.

It also starts to lead to questions about things like the Move
controller, the Kinect, the Leap Motion, the sensors in the Oculus
Rift and that crazy thing from Razer that dumps a torrent of quats at
you.  Unless (as you say) you're going to get into self-describing
protocols, the axe has to come down somewhere.

So, tackling accelerometers as a protocol is a bit of an
interesting balancing act, much harder (or at least, with more
potentially annoying decisions about tradeoffs) than the gamepad one.

  Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Todd Showalter
On Fri, May 3, 2013 at 12:12 PM, Daniel Stone  wrote:

>> I think edge resistance/edge snapping really wants pointer warping as 
>> well.
>
> It's really difficult to achieve a nicely responsive and fluid UI
> (i.e. doing this without jumps) when you're just warping the pointer.
> To be honest, I'd prefer to see an interface where, upon a click, you
> could set an acceleration (deceleration) factor which was valid for
> the duration of that click/drag only.  We already have drag & drop
> working kind of like this, so it's totally possible to do for relative
> (i.e. wl_pointer) devices.  The only two usecases I've seen come up
> for pointer warping are this and pointer confinement, which I'd rather
> do specifically than through warping - which is a massive minefield I
> really, really want to avoid.

Decelerate/accelerate would cover all the cases I can think of.

> But it's also a totally orthogonal discussion. :)

True enough.  :)

  Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Rick Yorgason
Pekka Paalanen  writes:
> > > Maybe there could be some scheme, where we would not need to have the
> > > wl_seat<->player mapping configurable in games after all, if one goes
> > > with server side heuristics. There are also the things Daniel wrote
> > > about, which link directly to what we can do.
> > 
> > I vote do it on the server, however it winds up being done.  It
> > means the client is isolated from a whole bunch of things it would
> > otherwise need to explicitly support, and it means that things happen
> > consistently between games.  It also means that any bugs in the
> > process will be addressable without shipping a new build of the game.
> 
> Cool, I agree with that. 

In console-land, all three major consoles allow you to forcibly change your
controller numbers by pressing or holding the home button and choosing some
option to reconfigure your controller, so there's certainly good precedent
for it being handled by the OS.

Also worth remembering is that all three major consoles have controller
number LEDs from 1 to 4. It would be nice if we could assume that the
controller indicator LED matched the pad_index whenever possible.

-Rick-

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Rick Yorgason
Daniel Stone and Pekka Paalanen wrote:
> ...a bunch of stuff about per-player keyboards and wl_seats...

Okay, let's go over some typical situations:

* It's common for controllers to have keyboard and/or headset attachments,
and built-in touch screens are becoming more common. These are clearly
intended to be associated with the gamepad, rather than the computer.

* Advanced users may want to emulate this connection by plugging extra
devices straight into their computer instead of into their gamepad, and
would need some way to associate those devices with the gamepad.

* A gamepad keyboard may be used in a different way than the system
keyboard. For instance, you could have four local players playing
split-screen against a bunch of other players online. Each player should be
able to use their own keyboard attachment to send their own chat messages.

So perhaps all HIDs should have a pad_index (or player_index?). Anything
plugged directly into a controller will get the same pad_index as the
controller, but an advanced configuration screen could allow you to force a
certain pad_index for each device.

Any app which is not controller-aware can blissfully ignore the pad_index,
in which case they'll treat the keyboards or touch screens as aggregate devices.
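
One way to picture that association in client code (purely illustrative;
wl_gamepad is the hypothetical interface discussed in this thread, and
pad_index is the proposed field, not anything in the protocol today):

struct wl_gamepad;    /* hypothetical interface from this thread */
struct wl_keyboard;
struct wl_touch;

struct player_devices {
    int pad_index;              /* proposed pad/player index */
    struct wl_gamepad *pad;
    struct wl_keyboard *kbd;    /* keyboard attachment, if any */
    struct wl_touch *touch;     /* controller touch screen, if any */
};

A controller-unaware client would simply never group devices this way, and
would keep treating keyboards and touch screens as seat-wide aggregates.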

-Rick-

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Bill Spitzak

Todd Showalter wrote:

> Decelerate/accelerate would cover all the cases I can think of.

I thought you said the speed of mouse movement controlled whether it
slowed down or not. I.e. if the user quickly dragged the slider to the
bottom then the scrollbar was at the bottom, but if they moved slowly
then it moved by lines. So it is not just "slow the mouse down" but
"slow the mouse down only if it is being moved at this speed".

For the proposed scrollbar the amount of deceleration depends on the
document size, so the client has to control it. And an extremely large
document is going to give you problems by making the mouse movements
approach the fixed point fraction size. If this is avoided by returning
the mouse movements in full-size pieces like the pointer-lock does, then
they have to be marked as to whether they are decelerated or not.

If the threshold between fast/slow is going to be built into the server,
I really recommend some other annoying things be built into the server
as well, such as decisions about whether something is a double-click,
whether the user holding the mouse down and moving it a bit is "clicking"
or "dragging", and at what point the user not holding the mouse down but
moving it tiny bits is "hovering". Inconsistency between applications in
these is incredibly annoying to the user.

The server is also going to have to warp the cursor back when it
receives the decelerate request, and serial numbers have to be used so
the client can ignore the pre-decelerate mouse movements.

All in all I see this as being enormously more complex than pointer warp.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-03 Thread Rick Yorgason
Pekka Paalanen  writes:
> Uh oh, yuk...
> 
> I wonder if one would have serious trouble achieving the same on
> Wayland. X is so much more liberal on what one can do wrt. protocol and
> the C API. For instance, in X I believe one can query a lot of stuff
> from the server, in Wayland nothing. In X a window reference is just an
> integer, and if you get something wrong, I think you get an error that
> you can choose to handle non-fatally. In Wayland, you have a pointer,
> that means you are susceptible to use-after-free and segfaults, and if
> you do something wrong, the server disconnects the whole client on the
> spot.

That would be a problem. Steam was designed so that the content creators
wouldn't have to recompile their games for the distribution platform, which
is why they always do the overlay with hooking.

We're obviously stepping outside of the original topic here, but it's
probably worth looking into whether embedded compositors are up to the task
here, or whether some new hooking extension will be required.

The requirements would be:

* The game should "feel" like any other game, in that task-switching, window
decorations, etc, should not be affected.

* If the game has its own launcher (in Linux it would typically be a Qt or
GTK app that spawns the main game) that should appear to be completely
unaffected.

* Any OpenGL window created by the launched app or any of its spawned apps
needs to be able to draw the overlay and intercept input.

* This should be done with minimal performance overhead.

There are other applications that use similar functionality. One common one
on Windows is called Fraps, which is for recording games. It overlays an FPS
counter, and intercepts the keyboard to turn recording on/off. There's a
Linux clone called "Faps", which I haven't used.

Another one in Windows-land is Afterburner, which is used by overclockers.
It has an option to show GPU temperature, clock speed, etc, in an overlay
over your game.

-Rick-

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-05 Thread Pekka Paalanen
On Fri, 3 May 2013 17:42:20 +0100
Daniel Stone  wrote:

> Hi,
> 
> On 3 May 2013 08:17, Pekka Paalanen  wrote:
> > On Thu, 2 May 2013 19:28:41 +0100
> > Daniel Stone  wrote:
> >> There's one crucial difference though, and one that's going to come up
> >> when we address graphics tablets / digitisers too.  wl_pointer works
> >> as a single interface because no matter how many mice are present, you
> >> can aggregate them together and come up with a sensible result: they
> >> all move the sprite to one location.  wl_touch fudges around this by
> >> essentially asserting that not only will you generally only have one
> >> direct touchscreen, but it provides for multiple touches, so you can
> >> pretend one touch each on multiple screens are multiple touches on a
> >> single screen.
> >
> > Right. Could we just say that each such non-aggregatable device must be
> > put into a wl_seat that does not already have such a device?
> > Or make that an implementors guideline rather than a hard requirement
> > in the protocol spec.
> 
> *shrug*, it really depends.  If we're going to say that gamepads are
> associated with focus, then they have to go in a _specific_ wl_seat:
> the focus is per-seat, so if we're saying that clicking on a window
> with your mouse to focus it (or Alt-Tabbing to it) also redirects
> gamepad events there, then the gamepad needs to be part of _that_ seat
> which changed the focus.  Remember that we can have multiple focii per
> the protocol (though the UI for that gets very interesting very
> quickly).
> 
> If they're unaffected by the focus - which they would be if they're
> just going into random new wl_seats - then they shouldn't be in
> wl_seat just because it's the container we have for input devices
> right now, they should have their own interfaces.  Which really means
> wl_gamepad_manager which, when bound to, advertises new_id
> wl_gamepads.
> 
> tl;dr: wl_seat has a very specific meaning of a set of devices with
> one focus, please don't abuse it.

I'm not too clear on what it is.

In a wl_seat, we have one kbd focus, and one pointer focus. These
two are unrelated, except sometimes some pointer action may change
the kbd focus. Most of the time, they have no relation.

I was thinking of adding a third one: the gamepad focus. It could
be independent from kbd and pointer foci, or maybe it is assigned
with the kbd focus. Or maybe the gamepad focus is assigned to the
surface having any wl_seat's kbd focus, whose client has bound
to the gamepad.

In any case, we have the fundamental problem: which client gets the
gamepad events at a point in time?

There can be several clients bound to any gamepad, and the target
(focus) must be switchable intuitively.

Is it wrong to think of a wl_seat as a user--a player, who may have a
gamepad?

It's just too tempting for me to think that each player corresponds
to a particular wl_seat.

> >> The gamepad interaction doesn't have this luxury, and neither do
> >> tablets.  I don't think splitting them out to separate seats is the
> >> right idea though: what if (incoming stupid hypothetical alert) you
> >> had four people on a single system, each with their own keyboards and
> >> gamepads.  Kind of like consoles are today, really.  Ideally, you'd
> >> want an association between the keyboards and gamepads, which would be
> >> impossible if every gamepad had one separate wl_seat whose sole job
> >> was to nest it.
> >
> > So... what's wrong in putting each keyboard into the wl_seat where it
> > belongs, along with the gamepad?
> 
> In that case, yes, we would have wl_seats with one wl_keyboard and
> multiple wl_gamepads.
> 
> >> I think it'd be better to, instead of wl_seat::get_gamepad returning a
> >> single new_id wl_gamepad, as wl_pointer/etc do it today, have
> >> wl_seat::get_gamepads, which would send one wl_seat::gamepad event
> >> with a new_id wl_gamepad, for every gamepad which was there or
> >> subsequently added.  That way we keep the seat association, but can
> >> still deal with every gamepad individually.
> >
> > It would be left for the client to decide which gamepad it wants from
> > which wl_seat, right?
> >
> > Do we want to force all clients to choose every non-aggregatable device
> > this way?
> 
> No idea. :)
> 
> > Essentially, that would mean that wl_seat are just for the traditional
> > keyboard & mouse (and touchscreen so far) association, and then
> > everything else would be left for each client to assign to different
> > wl_seats on their own. This seems strange. Why do we need a wl_seat
> > then, why not do the same with keyboards and mice?
> >
> > Oh right, focus. You want to be able to control keyboard focus with a
> > pointer. Why is a gamepad focus different? Would all gamepads follow
> > the keyboard focus? If there are several wl_seats with kbd & ptr, which
> > keyboard focus do they follow? What if the same gamepad is left active
> > in more than one wl_seat? What if there is no keyboard or pointer, e.g.
> > you had only a touchscreen and two gamepads (say, IVI)?

Re: Input and games.

2013-05-05 Thread Todd Showalter
On Sun, May 5, 2013 at 12:55 PM, Pekka Paalanen  wrote:

> In a wl_seat, we have one kbd focus, and one pointer focus. These
> two are unrelated, except sometimes some pointer action may change
> the kbd focus. Most of the time, they have no relation.

As a total aside, OSX has this and it drives me nuts.  Scrollwheel
focus follows the pointer, keyboard focus doesn't.  In practice what
that means is that whenever I'm on OSX I wind up closing the wrong
thing.  Example:

- running an irc client and firefox
- colleague sends an url, I click on it
- firefox brings up the url, I mouse over to it and scroll through
with the scroll wheel
- I'm done with the link, clover-w to close the tab, and it closes my
IRC session instead, because keyboard focus never left the irc window

I've had to use OSX for a couple of years now because of some iOS
projects we've been working on, and this still bites me at least once
a day.  It's *completely* counterintuitive GUI behaviour.

> I was thinking of adding a third one: the gamepad focus. It could
> be independent from kbd and pointer foci, or maybe it is assigned
> with the kbd focus. Or maybe the gamepad focus is assigned to the
> surface having any wl_seat's the kbd focus, whose client has bound
> to the gamepad.
>
> In any case, we have the fundamental problem: which client gets the
> gamepad events at a point in time?
>
> There can be several clients bound to any gamepad, and the target
> (focus) must be switchable intuitively.
>
> Is it wrong to think of a wl_seat as a user--a player, who may have a
> gamepad?
>
> It's just too tempting for me to think that each player corresponds
> to a particular wl_seat.

I don't think there's any problem in principle with the gamepad
events being delivered to the same client that has keyboard focus.
The only annoying thing is if (in a multiplayer game) someone can
screw you by sending you a well-timed IM that pops up a window and
steals focus, but honestly I think that's more an argument against
focus stealing than it is for not attaching gamepad focus to keyboard
focus.

I don't see any reason why you couldn't have two (or N, for some
reasonable N) games running at the same time, using the same gamepad,
and only the program with focus sees gamepad events.  There are some
tricky cases, if the game wants to have multiple windows with no
containing root window, for example, but maybe that's one of those
"well, don't do that, then" cases.

Having given it some thought, I'd be inclined to be cautious about
how much you consider the gamepad-with-builtin-keyboard case.  They
really made those things to make MMOs viable on game consoles.  As far
as I know, not a lot of people have them, and the main argument for
them is on consoles which don't have native keyboards.  On a PC, the
kinds of games that need keyboards are the kinds of games you tend to
want access to the mouse.  That's not to say that nobody will ever use
a gamepad keyboard in a game on Linux, but I'd argue it's on thin
enough ground that I wouldn't let it drive the design considerations.

  Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-05 Thread Pekka Paalanen
On Sun, 5 May 2013 15:27:54 -0400
Todd Showalter  wrote:

> On Sun, May 5, 2013 at 12:55 PM, Pekka Paalanen  wrote:
> 
> > I was thinking of adding a third one: the gamepad focus. It could
> > be independent from kbd and pointer foci, or maybe it is assigned
> > with the kbd focus. Or maybe the gamepad focus is assigned to the
> > surface having any wl_seat's kbd focus, whose client has bound
> > to the gamepad.
> >
> > In any case, we have the fundamental problem: which client gets the
> > gamepad events at a point in time?
> >
> > There can be several clients bound to any gamepad, and the target
> > (focus) must be switchable intuitively.
> >
> > Is it wrong to think a wl_seat as a user--a player, that may have a
> > gamepad?
> >
> > It's just too tempting for me to think that each player corresponds
> > to a particular wl_seat.
> 
> I don't think there's any problem in principle with the gamepad
> events being delivered to the same client that has keyboard focus.
> The only annoying thing is if (in a multiplayer game) someone can
> screw you by sending you a well-timed IM that pops up a window and
> steals focus, but honestly I think that's more an argument against
> focus stealing than it is for not attaching gamepad focus to keyboard
> focus.

Focus stealing indeed, there has been some discussion about that.

The problem is, that a wl_seat may not have a keyboard, hence it does
not have a keyboard focus. And if there are multiple wl_seats, one for
each player, as a user I don't want to individually assign each player's
focus to the game.

> I don't see any reason why you couldn't have two (or N, for some
> reasonable N) games running at the same time, using the same gamepad,
> and only the program with focus sees gamepad events.  There are some
> tricky cases, if the game wants to have multiple windows with no
> containing root window, for example, but maybe that's one of those
> "well, don't do that, then" cases.

That shouldn't be a problem, since with the concept of focus, we have
input device events "enter" and "leave", which tell a client which
surface has the focus.

And this also affects the protocol design again. If we have the concept
of gamepad focus, we need the enter and leave events in some inteface.

> Having given it some thought, I'd be inclined to be cautious about
> how much you consider the gamepad-with-builtin-keyboard case.  They
> really made those things to make MMOs viable on game consoles.  As far
> as I know, not a lot of people have them, and the main argument for
> them is on consoles which don't have native keyboards.  On a PC, the
> kinds of games that need keyboards are the kinds of games you tend to
> want access to the mouse.  That's not to say that nobody will ever use
> a gamepad keyboard in a game on Linux, but I'd argue it's on thin
> enough ground that I wouldn't let it drive the design considerations.

I could imagine the Wii pointer exposed as a wl_pointer with the
gamepad... hrm, that's another curious input device that does not fit
well in our categories: it needs a cursor image, but provides absolute
positions unlike a mouse, right?


Thanks,
pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-06 Thread David Herrmann
Hi Pekka

On Mon, May 6, 2013 at 8:54 AM, Pekka Paalanen  wrote:
> On Sun, 5 May 2013 15:27:54 -0400
> Todd Showalter  wrote:
>> Having given it some thought, I'd be inclined to be cautious about
>> how much you consider the gamepad-with-builtin-keyboard case.  They
>> really made those things to make MMOs viable on game consoles.  As far
>> as I know, not a lot of people have them, and the main argument for
>> them is on consoles which don't have native keyboards.  On a PC, the
>> kinds of games that need keyboards are the kinds of games you tend to
>> want access to the mouse.  That's not to say that nobody will ever use
>> a gamepad keyboard in a game on Linux, but I'd argue it's on thin
>> enough ground that I wouldn't let it drive the design considerations.
>
> I could imagine the Wii pointer exposed as a wl_pointer with the
> gamepad... hrm, that's another curious input device that does not fit
> well in our categories: it needs a cursor image, but provides absolute
> positions unlike a mouse, right?

I wrote the Wii-Remote kernel driver and I can assure you that there
is _no_ way you can handle this as a generic device. It is a good
example of a device that needs client support to make sense.
Of course, you can write a compositor input driver that emulates a
mouse via the raw Wii-Remote input, but then you should also advertise
it as a mouse to the clients. Depending on the method you use, it can
report absolute or relative events (accel: relative, IR: absolute).

I think we shouldn't try to support such special devices. It doesn't
make sense. Instead, provide generic devices via generic interfaces
(mouse, keyboard, touch, gamepad) but also provide a raw interface so
clients can have device-specific support. And if it turns out that a
new device-class emerges, we can always add support for them.

Regards
David
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-06 Thread Pekka Paalanen
On Mon, 6 May 2013 09:07:01 +0200
David Herrmann  wrote:

> Hi Pekka
> 
> On Mon, May 6, 2013 at 8:54 AM, Pekka Paalanen  wrote:
> > On Sun, 5 May 2013 15:27:54 -0400
> > Todd Showalter  wrote:
> >> Having given it some thought, I'd be inclined to be cautious about
> >> how much you consider the gamepad-with-builtin-keyboard case.  They
> >> really made those things to make MMOs viable on game consoles.  As far
> >> as I know, not a lot of people have them, and the main argument for
> >> them is on consoles which don't have native keyboards.  On a PC, the
> >> kinds of games that need keyboards are the kinds of games you tend to
> >> want access to the mouse.  That's not to say that nobody will ever use
> >> a gamepad keyboard in a game on Linux, but I'd argue it's on thin
> >> enough ground that I wouldn't let it drive the design considerations.
> >
> > I could imagine the Wii pointer exposed as a wl_pointer with the
> > gamepad... hrm, that's another curious input device that does not fit
> > well in our categories: it needs a cursor image, but provides absolute
> > positions unlike a mouse, right?
> 
> I wrote the Wii-Remote kernel driver and I can assure you that there
> is _no_ way you can handle this as a generic device. It is a good
> example for a device that needs client support to make sense.
> Of course, you can write a compositor input driver that emulates a
> mouse via the raw Wii-Remote input, but then you should also advertise
> it as a mouse to the clients. Depending on the method you use, it can
> report absolute or relative events (accel: relative, IR: absolute)
> 
> I think we shouldn't try to support such special devices. It doesn't
> make sense. Instead, provide generic devices via generic interfaces
> (mouse, keyboard, touch, gamepad) but also provide a raw interface so
> clients can have device-specific support. And if it turns out that a
> new device-class emerges, we can always add support for them.

Ok, so is there no input device left (standard input
interfaces like wl_keyboard or wl_pointer, really) that we would really
like to associate with a gamepad?

If yes, that leaves only the gamepad focus issue, that ties into
wl_seats.


Thanks,
pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Input and games.

2013-05-06 Thread Daniel Stone
Hi,

On 5 May 2013 17:55, Pekka Paalanen  wrote:
> On Fri, 3 May 2013 17:42:20 +0100
> Daniel Stone  wrote:
>> tl;dr: wl_seat has a very specific meaning of a set of devices with
>> one focus, please don't abuse it.
>
> I'm not too clear on what it is.
>
> In a wl_seat, we have one kbd focus, and one pointer focus. These
> two are unrelated, except sometimes some pointer action may change
> the kbd focus. Most of the time, they have no relation.

'Most of the time'.  They are related though, occasionally affect each
other, and are not affected by the presence of, or actions from, other
seats.

> I was thinking of adding a third one: the gamepad focus. It could
> be independent from kbd and pointer foci, or maybe it is assigned
> with the kbd focus. Or maybe the gamepad focus is assigned to the
> > surface having any wl_seat's kbd focus, whose client has bound
> to the gamepad.
>
> In any case, we have the fundamental problem: which client gets the
> gamepad events at a point in time?
>
> There can be several clients bound to any gamepad, and the target
> (focus) must be switchable intuitively.
>
> Is it wrong to think of a wl_seat as a user--a player, who may have a
> gamepad?
>
> It's just too tempting for me to think that each player corresponds
> to a particular wl_seat.

No, that's exactly right: one seat per user, hence the name (think
chair).  I don't think we need a separate gamepad focus; we could just
use the keyboard focus (or the pointer focus, if there's no keyboard,
or some magical compositor-specific metric if there's no pointer
either).

>> > This is a new situation, and so many open questions... I just continued
>> > on the existing pattern.
>>
>> Yeah, it really all depends on these questions.  But intuitively, I'd
>> say that gamepads should follow a seat's focus, which means expanding
>> wl_seat to be able to advertise multiple gamepads.  Even on touch, we
>> still have wl_touch as part of wl_seat, driving the focus.  And I
>> don't think a gamepad could ever be a part of multiple seats; perhaps
>> it could be shifted between seats if necessary, but this is a problem
>> we already have with keyboard, pointer and touch today.  And you don't
>> need to deal with that in the protocol: just have the compositor
>> destroy the device and create a new one in the new seat.
>
> Right, so if every wl_seat advertises every gamepad, how should the
> server route the gamepad input events? Should it be allowed to
> duplicate the events for every active wl_seat-binding of a gamepad?

I'm struggling to follow the leap between the previously quoted
paragraph (every player has likely one gamepad, every seat is exactly
one player), and this (every seat advertises every gamepad).  There is
definitely a need to be able to pin devices to a particular seat, and
move that pin, in the multiseat case, but that's no different to the
existing need to pin and move pointers and keyboards too.  And I
believe the answer is the same: that this protocol is unaffected, and
any moving mechanism would have to be a separate interface.

> Or, do we want to have a unique input event destination, and never
> duplicate events to several destinations?

We must never duplicate events to multiple destinations.  Imagine
pressing the start key and unpausing two games simultaneously.

> If every wl_seat advertises every gamepad, how do we avoid having a
> gamepad part of several wl_seats at the same time? When any client
> binds to any gamepad on any wl_seat, will the server then make
> that gamepad unavailable in all other wl_seats?
>
> We do not have this problem with keyboards and pointers, because
> keyboards and pointers are pre-assigned to wl_seats, and they do
> not change by client actions, nor do clients assume they can access
> all keyboards and pointers via the same seat. Unless, we think
> about having two separate pointers, and just one physical keyboard,
> and wanting to type for both pointers, one at a time. I don't
> recall any discussion of that, and I've just assumed it won't be
> implemented.

Well, a compositor could perfectly well implement that within the
protocol as it stands today.  But I wouldn't recommend it. :)

> **
>
> A very simple solution could be this (leveraging from your ideas):
>
> wl_gamepad_manager:
>
> A global interface, where you register wl_surfaces, and which
> advertises all gamepads in the session.
>
> wl_gamepad:
>
> Interface representing a single gamepad. Input events from the
> physical device are routed here, when any of the registered
> wl_surfaces have keyboard focus.
>
> There are some problems:
> - how to associate keyboards etc. with a player (gamepad)?
> - which seat's keyboard focus to follow?
> Seems that we have to involve the wl_seat somehow.
>
> Could this be a way forward?
>
> Or maybe wl_gamepad_manager is not a global itself, but when
> created, it gets bound to a specific wl_seat? Solves one problem,
> but introduces other ones...

For me, I'm thinking we add a gamepad

Re: Input and games.

2013-05-06 Thread Todd Showalter
On 2013-05-06, at 2:54 AM, Pekka Paalanen  wrote:

>>I don't think there's any problem in principle with the gamepad
>> events being delivered to the same client that has keyboard focus.
>> The only annoying thing is if (in a multiplayer game) someone can
>> screw you by sending you a well-timed IM that pops up a window and
>> steals focus, but honestly I think that's more an argument against
>> focus stealing than it is for not attaching gamepad focus to keyboard
>> focus.
> 
> Focus stealing indeed, there has been some discussion about that.
> 
> The problem is, that a wl_seat may not have a keyboard, hence it does
> not have a keyboard focus. And if there are multiple wl_seats, one for
> each player, as a user I don't want to individually assign each player's
> focus to the game.

That seems like an argument for ganging gamepads into a single seat, 
preferably one with a keyboard. I presume we want the normal case to be the 
easy case, and I think the normal case is one game running that has the focus 
of the keyboard, mouse and gamepads.

It's important to support other scenarios, but I think that is the one that 
has to Just Work with as little user effort as possible.

> I could imagine the Wii pointer exposed as a wl_pointer with the
> gamepad... hrm, that's another curious input device that does not fit
> well in our categories: it needs a cursor image, but provides absolute
> positions unlike a mouse, right?

Sort of. It's actually more complex in some ways, because the position 
actually comes from a camera in the end of the wiimote looking at a couple of 
infrared LEDs.  It also has accelerometers and hot-docking peripherals, and a 
built-in speaker.

It can lose sight of the "screen", at which point the pointer is in an 
undefined location. The accelerometers mean the pointer can have an 
orientation; you can in principle rotate the pointer.  It can change capability 
somewhat drastically depending on what is jacked in.

It's kind of neat, but if you're writing something that uses it, you're 
going to be writing a lot of device-specific code. I'm not sure how much of a 
useful abstraction can be built around it.

The other thing is, the pointer is driven by a camera looking at LEDs, but 
IIRC decoding that happens on the host machine; it just gets a stream of 
intensity pixmaps from the device and uses that to calculate position. Which 
means there are potentially a lot of interesting things you could do with it if 
you know a little signal processing and how to wire up LEDs.

Todd. 

--
  Todd Showalter, President
  Electron Jump Games, Inc.

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Fwd: Input and games.

2013-04-29 Thread Bengt Richter

On 04/27/2013 03:05 AM Todd Showalter wrote:

I failed to reply-all before, so I'm forwarding this back to the list.

On Fri, Apr 26, 2013 at 5:46 PM, Jason Ekstrand  wrote:


My first general comment is about floating point.  I'm not 100% sure what
all went into the design decision to make wl_fixed have 8 bits of fractional
precision vs. 12 or 16.  I'm guessing that they wanted the increased integer
capability, but you'd have to ask Kristian about that.  My understanding is
that most game controllers work with ranges of [0,1] or [-1,1] which would
be wasteful to put into wl_fixed.  Looking below, it seems as if you're
fairly consistently picking a 16 bit fractional part.  That breaks out of
the norm of the wire format a bit, but I think it's justified in this case.
The big thing is to be consistent which it looks like you're doing anyway.


 In my experience, most game controllers actually return byte
values which you wind up interpreting either as signed or unsigned
depending on what makes sense.  Certainly that's been the case
historically.  In games we typically do something like:

stick.x = ((float)raw_x) / ((raw_x >= 0) ? 127.0f : 128.0f);
stick.y = ((float)raw_y) / ((raw_y >= 0) ? 127.0f : 128.0f);


I'm not an electronics or game equipment insider, but I wonder
why the above isn't

   stick.x = (((float)(raw_x+128)) / 127.5f) - 1.0f;
   stick.y = (((float)(raw_y+128)) / 127.5f) - 1.0f;

thus mapping [0,255] to [-1.0 to +1.0] symmetrically?

I think you might want to map a shaft encoder differently
though, since while a linear fence with 256 pickets makes
a span of 255 spaces end to end, a circular fence of 256
pickets makes 256 spaces along the full circle.

I.e., if you wanted to map [-pi, +pi] to [-1.0, +1.0]
you'd just scale raw_angle times 1.0/128.0, given that
-128 represents -pi diametrically across from 0 and
coinciding with one step beyond 127 as +pi at +128.
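
In C, the two mappings sketched above might look like this (illustrative
only, and assuming signed 8-bit raw values, which is itself an assumption
about the hardware):

/* linear axis: [-128, 127] -> [-1.0, +1.0], symmetric end to end */
float map_linear(int raw)
{
    return ((float)(raw + 128)) / 127.5f - 1.0f;
}

/* shaft encoder: [-128, 127] -> [-1.0, +1.0), with -128 = -pi and
 * +128 (one step beyond 127) coinciding with +pi */
float map_angle(int raw)
{
    return (float)raw / 128.0f;
}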


Another concern is how to map [0, 255] onto [0, 2^15 - 1] cleanly.
Unfortunately, there is no good way to do this so that 0 ->  0 and 255 ->
2^15 - 1.  Perhaps that doesn't matter much for games since you're sensing
human movements which will be slightly different for each controller anyway.


 There is, actually:

expanded = (base << 7) | (base >> 1);

 ie: repeat the bit pattern down into the lower bits.  Examples:

11111111 -> (111111110000000) | (1111111) -> 111111111111111
00000000 -> (000000000000000) | (0000000) -> 000000000000000
10000000 -> (100000000000000) | (1000000) -> 100000001000000
10110010 -> (101100100000000) | (1011001) -> 101100101011001

 And so forth.  It's the same scheme you use when doing color
channel expansion.  I haven't seen a rigorous mathematical proof that
it's correct, but I'd be surprised if someone more inclined than I
hasn't come up with one.


My take on an integer arithmetic interpretation in python, FWIW:

 >>> def expand(base): return ( base << 7 ) | ( base >> 1 )
 ...
 >>> def altexp(base): return base==255 and 32767 or base*2**15/255
 ...
 >>> set([expand(i)==altexp(i) for i in range(256)])  # show all results match
 set([True])

IOW, you could say expand is outputting the base/255 fraction of 2**15,
(not 2**15-1 BTW), evaluated by integer multiplying before unrounded integer
dividing, except where the fraction is 255/255.

If you do it in floating point and round up, you can map [0,255] to [0,2**15-1]
and not special case 255/255:

>>> def altex2(base): return int((base*(2.0**15-1.0)/255.)+.5)
...
>>> set([expand(i)-altex2(i) for i in range(256)])
set([0])

I guess that is closer to the real math explanation.

[...]

BTW, does input come in well-behaved chunks, so that you get a
well-aligned struct without having to assemble it yourself from a
stream of bytes (the way showkey has to group the burst of bytes that
comes from a function key press)? Is it one record at a time, or as
many as are available and will fit in the buffer you supply?

I.e., how hard would it be to write a showkey utility
for game inputs, with similar cooked and raw options?

BTW2, does showkey itself work with wayland in all modes?

Regards,
Bengt Richter

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Fwd: Input and games.

2013-04-29 Thread Todd Showalter
On Mon, Apr 29, 2013 at 9:57 AM, Bengt Richter  wrote:
> On 04/27/2013 03:05 AM Todd Showalter wrote:

>> stick.x = ((float)raw_x) / ((raw_x >= 0) ? 127.0f : 128.0f);
>> stick.y = ((float)raw_y) / ((raw_y >= 0) ? 127.0f : 128.0f);
>>
> I'm not an electronics or game equipment insider, but I wonder
> why isn't the above
>
>stick.x = (((float)(raw_x+128)) / 127.5f) - 1.0f;
>stick.y = (((float)(raw_y+128)) / 127.5f) - 1.0f;
>
> thus mapping [0,255] to [-1.0 to +1.0] symmetrically?

You could do that too.  There are lots of ways to calculate the
range, and in practice you may well wind up feeding the output through
an easing function as well.  There's no advantage in being too
precise, however; while your function is symmetrical, it assumes
symmetrical input, which isn't a safe assumption.  My experience has
been that most of these things center at $80, at least in theory (Sega
hardware actually did settle precisely to $80, $80 in my experience),
but most hardware has some jitter around the center position.
Sometimes that gets hidden from you by the electronics, sometimes it
doesn't.

Many sticks also don't go to the full range even along the
encoding axis, and even those that do invariably don't encode the full
range along the diagonal; you wind up with a range of (very roughly)
-0.5(root 2) to 0.5(root 2).  They also usually have a lot of noise,
though that sometimes gets hidden from you by hysteresis built into
the electronics.  Practically speaking, a lot of gamepad joysticks
only really have about 5 bits of useful position data, and everything
below that is valid-looking noise.

So, if you could assume that the byte value you were getting
linearly and symmetrically encoded the analog range of the device,
then yes, your algorithm would be more correct.  I don't think you can
make that assumption, but that doesn't make your algorithm wrong, just
no more correct than the alternatives.

> I think you might want to map a shaft encoder differently
> though, since while a linear fence with 256 pickets make
> a span of 255 spaces end to end, a circular fence of 256
> pickets makes 256 spaces along the full circle.

I don't think we're mapping rotational controls anywhere, though;
the sticks are 2d vectors, and the triggers are 1d vectors.  The only
place you could get a circular input from that is from the angle of
stick deflection, and if the game wants that it can feed the stick
vector into atan2f().

> BTW, does input come in well-behaved chunks, so that you get a
> well-aligned struct without having to assemble it yourself from a
> stream of bytes (the way showkey has to group the burst of bytes that
> comes from a function key press)? Is it one record at a time, or as
> many as are available and will fit in the buffer you supply?
>
> I.e., how hard would it be to write a showkey utility
> for game inputs, with similar cooked and raw options?
>
> BTW2, does showkey itself work with wayland in all modes?

Are you asking about how the hardware behaves, or how the proposed
protocol will behave?

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Gamepad focus model (Re: Input and games.)

2013-05-06 Thread Pekka Paalanen
On Mon, 6 May 2013 11:01:28 +0100
Daniel Stone  wrote:

> Hi,
> 
> On 5 May 2013 17:55, Pekka Paalanen  wrote:
> > On Fri, 3 May 2013 17:42:20 +0100
> > Daniel Stone  wrote:
> >> tl;dr: wl_seat has a very specific meaning of a set of devices with
> >> one focus, please don't abuse it.
> >
> > I'm not too clear on what it is.
> >
> > In a wl_seat, we have one kbd focus, and one pointer focus. These
> > two are unrelated, except sometimes some pointer action may change
> > the kbd focus. Most of the time, they have no relation.
> 
> 'Most of the time'.  They are related though, occasionally affect each
> other, and are not affected by the presence of, or actions from, other
> seats.
> 
> > I was thinking of adding a third one: the gamepad focus. It could
> > be independent from kbd and pointer foci, or maybe it is assigned
> > with the kbd focus. Or maybe the gamepad focus is assigned to the
> > surface having any wl_seat's kbd focus, whose client has bound
> > to the gamepad.
> >
> > In any case, we have the fundamental problem: which client gets the
> > gamepad events at a point in time?
> >
> > There can be several clients bound to any gamepad, and the target
> > (focus) must be switchable intuitively.
> >
> > Is it wrong to think of a wl_seat as a user--a player, who may have a
> > gamepad?
> >
> > It's just too tempting for me to think that each player corresponds
> > to a particular wl_seat.
> 
> No, that's exactly right: one seat per user, hence the name (think
> chair).  I don't think we need a separate gamepad focus; we could just
> use the keyboard focus (or the pointer focus, if there's no keyboard,
> or some magical compositor-specific metric if there's no pointer
> either).
> 
> >> > This is a new situation, and so many open questions... I just continued
> >> > on the existing pattern.
> >>
> >> Yeah, it really all depends on these questions.  But intuitively, I'd
> >> say that gamepads should follow a seat's focus, which means expanding
> >> wl_seat to be able to advertise multiple gamepads.  Even on touch, we
> >> still have wl_touch as part of wl_seat, driving the focus.  And I
> >> don't think a gamepad could ever be a part of multiple seats; perhaps
> >> it could be shifted between seats if necessary, but this is a problem
> >> we already have with keyboard, pointer and touch today.  And you don't
> >> need to deal with that in the protocol: just have the compositor
> >> destroy the device and create a new one in the new seat.
> >
> > Right, so if every wl_seat advertises every gamepad, how should the
> > server route the gamepad input events? Should it be allowed to
> > duplicate the events for every active wl_seat-binding of a gamepad?
> 
> I'm struggling to follow the leap between the previously quoted
> paragraph (every player has likely one gamepad, every seat is exactly
> one player), and this (every seat advertises every gamepad).  There is
> definitely a need to be able to pin devices to a particular seat, and
> move that pin, in the multiseat case, but that's no different to the
> existing need to pin and move pointers and keyboards too.  And I
> believe the answer is the same: that this protocol is unaffected, and
> any moving mechanism would have to be a separate interface.
> 
> > Or, do we want to have a unique input event destination, and never
> > duplicate events to several destinations?
> 
> We must never duplicate events to multiple destinations.  Imagine
> pressing the start key and unpausing two games simultaneously.
> 
> > If every wl_seat advertises every gamepad, how do we avoid having a
> > gamepad part of several wl_seats at the same time? When any client
> > binds to any gamepad on any wl_seat, will the server then make
> > that gamepad unavailable in all other wl_seats?
> >
> > We do not have this problem with keyboards and pointers, because
> > keyboards and pointers are pre-assigned to wl_seats, and they do
> > not change by client actions, nor do clients assume they can access
> > all keyboards and pointers via the same seat. Unless, we think
> > about having two separate pointers, and just one physical keyboard,
> > and wanting to type for both pointers, one at a time. I don't
> > recall any discussion of that, and I've just assumed it won't be
> > implemented.
> 
> Well, a compositor could perfectly well implement that within the
> protocol as it stands today.  But I wouldn't recommend it. :)
> 
> > **
> >
> > A very simple solution could be this (leveraging from your ideas):
> >
> > wl_gamepad_manager:
> >
> > A global interface, where you register wl_surfaces, and which
> > advertises all gamepads in the session.
> >
> > wl_gamepad:
> >
> > Interface representing a single gamepad. Input events from the
> > physical device are routed here, when any of the registered
> > wl_surfaces have keyboard focus.
> >
> > There are some problems:
> > - how to associate keyboards etc. with a player (gamepad)?
> > - which seat's keyboard focus to follow?
> > Seems that we have to involve the wl_seat somehow.

Gamepad focus model (Re: Input and games.)

2013-05-08 Thread Martin Minarik
I like the second-class seat proposal; I had roughly the same idea and
called it a focus-inherit seat.

It's a kind of child seat that is, for some reason, not capable of
changing its own focus; instead it follows the mother seat's focus.

A nice thing is that the seat id = player id, making a separate player
id redundant.

Another nice thing is that the application can still recognize the
seats, which makes implementing a sub-compositor easier.

The most common scenario would be:
 A: joypad 1 on mother seat with keyboard and pointer
 B: joypad 2 on child seat

Let's say:
- the user plugs in another keyboard and pointer, and udev assigns them
  to B:
- weston promotes B: to a mother seat

The situation is then as follows:

 A: joypad 1 on mother seat with keyboard and pointer
 B: joypad 2 on mother seat with keyboard and pointer

The seats are now independent and can control the UI; user 2 can now
move the focus on the fly. But the question is: is this what's expected?
Let's say there is a full screen application and user 2 presses alt+tab;
now we have a surface stacking/ordering race.

But let's say the new keyboard and pointer got assigned to A: instead:

 A: joypad 1 on mother seat with 2x keyboard and pointer
 B: joypad 2 on child seat

Everything is then ok, so the device-seat assignment policy would be
used this way to control the expected behavior.

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Gamepad focus model (Re: Input and games.)

2013-05-06 Thread Todd Showalter
On Mon, May 6, 2013 at 8:36 AM, Pekka Paalanen  wrote:

> Into wl_seat, we should add a capability bit for gamepad. When the bit
> is set, a client can send wl_seat::get_gamepad_manager request, which
> creates a new wl_gamepad_manager object. (Do we actually need a
> capability bit?)

There are options here:

- have the capability bit; if the bit is set the client can request a
manager (sketched below) -- this has to deal with the case where the
client sent the request but the caps bit wasn't set, presumably by
returning the protocol equivalent of NULL or -1

- leave out the caps bit, client requests the manager if they want it,
they get NULL equivalent if there are no gamepads

- leave out the caps bit, gampad manager is always there, but can be
expected to return 0 if asked to enumerate gamepads when none are
connected

> A wl_gamepad_manager will send an event for each physical gamepad (as
> it dynamically appears, if hotplugged later) associated with this
> particular wl_seat, creating a wl_gamepad object for each.
>
> A wl_gamepad object will send an event about the player id as the first
> thing, and also if it later changes.

Some gamepads don't have player id controls, so we can't rely on
them, but supporting them where we can is useful.  I think it's best
viewed as a really forceful hint as to the player's ID, where
otherwise we're stuck doing heuristics with plugging.

> If a gamepad is hot-unplugged, a wl_gamepad event will notify about
> that, and the wl_gamepad object becomes inert (does not send any
> events, ignores all but the destroy request).

Dealing gracefully with things like wireless gamepads running
their batteries flat or moving out of radio range is important, and I
assume that's what this is meant to handle.  I presume the idea here is
that if the player moves back into range or replaces the batteries,
the wl_gamepad object revives?

> Gamepad input events are delivered according to the keyboard focus of
> the related wl_seat. If there is no keyboard to focus, then use the
> pointer focus, or something. It doesn't really affect the protocol
> design how the focus is assigned. However, would we need a
> wl_gamepad::enter,leave events? Probably, along with events for initial
> state. Or maybe enter/leave should be wl_gamepad_manager events?

I think we need enter/leave events.  The client can be responsible
for cleaning up its own state, though if an initial state is sent on
focus gain that makes things much easier.

I don't see anything here that raises any flags for me; at least
at first reading it seems quite usable.

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Gamepad focus model (Re: Input and games.)

2013-05-06 Thread Daniel Stone
On 6 May 2013 14:48, Todd Showalter  wrote:
> On Mon, May 6, 2013 at 8:36 AM, Pekka Paalanen  wrote:
>> Into wl_seat, we should add a capability bit for gamepad. When the bit
>> is set, a client can send wl_seat::get_gamepad_manager request, which
>> creates a new wl_gamepad_manager object. (Do we actually need a
>> capability bit?)
>
> There are options here:
>
> - have the capability bit; if the bit is set the client can request a
> manager -- this has to deal with the case where the client sent the
> request but the caps bit wasn't set, presumably by returning the
> protocol equivalent of NULL or -1
>
> - leave out the caps bit, client requests the manager if they want it,
> they get NULL equivalent if there are no gamepads

Wayland doesn't have a 'return NULL' facility: the client creates an
entry for the object, and a proxy, and then the server later
instantiates that object.  The only way to return NULL is an
asynchronous error which requires specialised handling.

> - leave out the caps bit, gamepad manager is always there, but can be
> expected to return 0 if asked to enumerate gamepads when none are
> connected

Similarly, there's no round trip to enumerate gamepads.  The
wl_gamepad_manager would advertise new (to the client) gamepads, i.e.
when the client creates the gamepad manager, the manager immediately
advertises all existing devices, and then later advertises new devices
as they're added.
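
To make that concrete, here's roughly what the scanner-generated
client side could look like -- entirely hypothetical, nothing below
exists yet, the names just follow the usual wayland-scanner
conventions:

    /* Hypothetical bindings; no such interfaces exist today. */
    struct wl_gamepad;
    struct wl_gamepad_manager;

    struct wl_gamepad_manager_listener {
        /* Sent once per already-connected gamepad when the manager
         * is created, and again on every later hotplug. */
        void (*gamepad)(void *data,
                        struct wl_gamepad_manager *manager,
                        struct wl_gamepad *gamepad);
    };

    static void handle_gamepad(void *data,
                               struct wl_gamepad_manager *manager,
                               struct wl_gamepad *gamepad)
    {
        /* Remember the device; player id, button/axis and removal
         * events would then arrive on the wl_gamepad object itself. */
    }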

>> Gamepad input events are delivered according to the keyboard focus of
>> the related wl_seat. If there is no keyboard to focus, then use the
>> pointer focus, or something. It doesn't really affect the protocol
>> design how the focus is assigned. However, would we need a
>> wl_gamepad::enter,leave events? Probably, along with events for initial
>> state. Or maybe enter/leave should be wl_gamepad_manager events?
>
> I think we need enter/leave events.  The client can be responsible
> for cleaning up its own state, though if an initial state is sent on
> focus gain that makes things much easier.

Yeah, I think we definitely need enter/leave with current state.

Cheers,
Daniel
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Gamepad focus model (Re: Input and games.)

2013-05-06 Thread Rick Yorgason
Pekka Paalanen  writes:
> This design allows several gamepads associated with one wl_seat, and
> thus one focus. It also allows gamepads to be assigned to different
> seats, but then we will have more problems on managing the foci, not
> unlike with keyboards. Hopefully there are no protocol design
> implications, though.
> 
> From the game's point of view, it will need to iterate over all
> wl_seats. For each seat with the gamepad capability bit set, create a
> wl_gamepad_manager, receive all wl_gamepad objects, and for each
> wl_gamepad receive the player id. Create your surfaces, wait for foci
> to arrive, and fire away.

This all sounds good, and I certainly wouldn't try to block it, but I do
have some thought experiments that lead to another solution.

Incoming thought experiments, from least to most theoretical:

Scenario 1) A typical single-user display server would likely only support
one wl_seat, and assign all wl_gamepads to that seat. Games can iterate over
them using wl_gamepad_manager. Great!

Scenario 2) Two users are using a multi-user display server. Each user has a
keyboard, mouse, and gamepad. User 2 has to set up their wl_seat using some
configuration window built into the display server, but once that's done
each user can jump in and out of a game as they see fit, assuming the game
was written to iterate over wl_seats properly. Neat!

Scenario 3) Two users are using gamepads with built-in touch screens (like
the PS4 and OUYA controllers). The players don't care about the ability to
have their own window focus, but the game wants to associate each touchpad
with its gamepad. Player 2 must set up their own wl_seat in the display
server, and must focus the game separately using the touchpad. Not ideal.

You might say, "yeah, it's a bit of extra effort to set up player 2, but the
cool thing is that player 2 can jump in and out of the game as they see
fit!" Only that's not necessarily true either. Since many games steal the
cursor, and player 2 has no keyboard, player 2 can jump into the game (using
the touchpad on the controller) but can't jump out.

Scenario 4) In this scenario we imagine that wl_seat has grown support for
wl_headset. Player 2 has an Xbox 360 controller with a headset plugged in.
To support this, player 2 must go through the display server's wl_seat
configuration to set up their own seat. Like in scenario 3, this is less
than ideal. And how does player 2 focus the game without a keyboard or mouse?

Now, here's my alternate suggestion:

* Drop wl_gamepad_manager.
* Give wl_seat the ability to share focus with the default seat.
* A typical display server would add each gamepad to a new seat that shares
focus with the default seat.
* Multiple wl_seats is *the* way to support multiple controllers.
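
Under this model the game's enumeration loop is just the normal
registry dance over the existing wl_seat global, one player per seat.
A sketch (the wl_registry calls are real, the player bookkeeping is
made up):

    #include <string.h>
    #include <wayland-client.h>

    #define MAX_PLAYERS 8

    static struct wl_seat *players[MAX_PLAYERS];
    static int num_players;

    /* Plugs into the usual wl_registry_listener.global slot. */
    static void registry_global(void *data, struct wl_registry *registry,
                                uint32_t name, const char *interface,
                                uint32_t version)
    {
        if (strcmp(interface, "wl_seat") == 0 &&
            num_players < MAX_PLAYERS) {
            /* One wl_seat == one player. */
            players[num_players++] =
                wl_registry_bind(registry, name, &wl_seat_interface, 1);
        }
    }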

Going through the scenarios again:

Scenario 1) A single-user display would assign each wl_gamepad to a unique
wl_seat that shares focus with the default seat. Games iterate through
wl_seats to get each player. Cool.

Scenario 2) On a multi-user display server, users can still use some
configuration screen to set up their own mouse/keyboard/gamepad/focus,
allowing them to jump in and out of the game as they see fit. As an added
bonus, any multi-controller game is *guaranteed* to work properly under this
setup, since the game programmer can't wrongly assume there's only one
wl_seat. Great!

Scenarios 3, 4) The display server can assume that devices attached to a
gamepad should go into the same seat as that gamepad, and don't need to have
their own focus. Player 2 is free to use their PS4/OUYA controller or
headset in a game without any extra configuration. Cool!

I don't know how much would break if wl_seats were allowed to share focus,
but it seems to be more general and future-proof than the wl_gamepad_manager
solution.

-Rick-

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Gamepad focus model (Re: Input and games.)

2013-05-06 Thread Jason Ekstrand
On Mon, May 6, 2013 at 5:10 PM, Rick Yorgason  wrote:

> Pekka Paalanen  writes:
> > This design allows several gamepads associated with one wl_seat, and
> > thus one focus. It also allows gamepads to be assigned to different
> > seats, but then we will have more problems on managing the foci, not
> > unlike with keyboards. Hopefully there are no protocol design
> > implications, though.
> >
> > From the game's point of view, it will need to iterate over all
> > wl_seats. For each seat with the gamepad capability bit set, create a
> > wl_gamepad_manager, receive all wl_gamepad objects, and for each
> > wl_gamepad receive the player id. Create your surfaces, wait for foci
> > to arrive, and fire away.
>
> This all sounds good, and I certainly wouldn't try to block it, but I do
> have some thought experiments that lead to another solution.
>
> Incoming thought experiments, from least to most theoretical:
>
> Scenario 1) A typical single-user display server would likely only support
> one wl_seat, and assign all wl_gamepads to that seat. Games can iterate
> over
> them using wl_gamepad_manager. Great!
>
> Scenario 2) Two users are using a multi-user display server. Each user has
> a
> keyboard, mouse, and gamepad. User 2 has to set up their wl_seat using some
> configuration window built into the display server, but once that's done
> each user can jump in and out of a game as they see fit, assuming the game
> was written to iterate over wl_seats properly. Neat!
>
> Scenario 3) Two users are using gamepads with built-in touch screens (like
> the PS4 and OUYA controllers). The players don't care about the ability to
> have their own window focus, but the game wants to associate each touchpad
> with its gamepad. Player 2 must set up their own wl_seat in the display
> server, and must focus the game separately using the touchpad. Not ideal.
>
> You might say, "yeah, it's a bit of extra effort to set up player 2, but
> the
> cool thing is that player 2 can jump in and out of the game as they see
> fit!" Only that's not necessarily true either. Since many games steal the
> cursor, and player 2 has no keyboard, player 2 can jump into the game
> (using
> the touchpad on the controller) but can't jump out.
>

I think this is where a judicious use of the "home" button could come in
very handy.  As long as the game is nice and releases the mouse on "home",
or the compositor takes over or something, the problem is solved.


> Scenario 4) In this scenario we imagine that wl_seat has grown support for
> wl_headset. Player 2 has an Xbox 360 controller with a headset plugged in.
> To support this, player 2 must go through the display server's wl_seat
> configuration to set up their own seat. Like in scenario 3, this is less
> than ideal. And how does player 2 focus the game without a keyboard or
> mouse?

>

> Now, here's my alternate suggestion:
>
> * Drop wl_gamepad_manager.
>

Personally, I don't see why we need wl_gamepad_manager in order to have
multiple gamepads.  Why can't we have a get_gamepads request to which the
seat responds with a gamepad event?  Then again, the extra protocol object
isn't that bad either.


> * Give wl_seat the ability to share focus with the default seat.
>
> * A typical display server would add each gamepad to a new seat that shares
> focus with the default seat.
>

I don't think we should restrict this to "default".  Effectively, we would
need a way for a seat to have a "parent" seat from which it derives its
focus.  This may get to be a mess, especially from a user-interface
perspective.  Perhaps you could have an "automatically parent to whatever my
default is" paradigm.  However, for more interesting configs, it gets a lot
more confusing.  See below.


> * Multiple wl_seats is *the* way to support multiple controllers.
>
> Going through the scenarios again:
>
> Scenario 1) A single-user display would assign each wl_gamepad to a unique
> wl_seat that shares focus with the default seat. Games iterate through
> wl_seats to get each player. Cool.
>
> Scenario 2) On a multi-user display server, users can still use some
> configuration screen to set up their own mouse/keyboard/gamepad/focus,
> allowing them to jump in and out of the game as they see fit. As an added
> bonus, any multi-controller game is *guaranteed* to work properly under
> this
> setup, since the game programmer can't wrongly assume there's only one
> wl_seat. Great!
>

Ok, let's bump this a step further and say that you have four controllers
and two mice/keyboards.  You want to have 2 gamepads for each mouse.  Now
we have a problem since only the "default" focus can have multiple gamepads
associated with it.  Like I said above, this can be solved by having a
parenting structure for seats.


> Scenarios 3, 4) The display server can assume that devices attached to a
> gamepad should go into the same seat as that gamepad, and don't need to
> have
> their own focus. Player 2 is free to use their PS4/OUYA controller or
> headset in a game without any extra configuration. Cool!

Re: Gamepad focus model (Re: Input and games.)

2013-05-06 Thread Rick Yorgason
Jason Ekstrand  writes:
>> Scenario 2) Two users are using a multi-user display server. Each user
>> has a keyboard, mouse, and gamepad. User 2 has to set up their wl_seat
>> using some
>> configuration window built into the display server, but once that's done
>> each user can jump in and out of a game as they see fit, assuming the
>> game was written to iterate over wl_seats properly. Neat!
>>
>> Scenario 3) Two users are using gamepads with built-in touch screens
>> (like the PS4 and OUYA controllers). The players don't care about the
>> ability to have their own window focus, but the game wants to associate
>> each touchpad with its gamepad. Player 2 must set up their own wl_seat
>> in the display server, and must focus the game separately using the
>> touchpad. Not ideal.
>>
>> You might say, "yeah, it's a bit of extra effort to set up player 2, but
>> the cool thing is that player 2 can jump in and out of the game as they
>> see fit!" Only that's not necessarily true either. Since many games
>> steal the cursor, and player 2 has no keyboard, player 2 can jump into
>> the game (using the touchpad on the controller) but can't jump out.

> I think this is where a judicious use of the "home" button could come in
> very handy.  As long as the game is nice and releases the mouse on
> "home", or the compositor takes over or something, the problem is solved.

Well, *a* problem is solved, but if you read carefully you'll see that's not
actually the problem the user was trying to solve. You're still saying
"Yeah, I know it's a pain to set up your controller, but look at this neat
thing you can do that you never really cared about!"

>> Scenario 4) In this scenario we imagine that wl_seat has grown support
>> for wl_headset. Player 2 has an Xbox 360 controller with a headset
>> plugged in. To support this, player 2 must go through the display
>> server's wl_seat configuration to set up their own seat. Like in
>> scenario 3, this is less than ideal. And how does player 2 focus the
>> game without a keyboard or mouse?

> I don't think we should restrict this to "default".  Effectively, we
> would need a way for a seat to have a "parent" seat from which it derives
> its focus.  This may get to be a mess, especially from a user-interface
> perspective.  Perhaps you could have an "automatically parent to whatever
> my default is" paradigm.  However, for more interesting configs, it gets a
> lot more confusing.  See below.

Yes, a parent seat is definitely more general. I was originally thinking of
something like that, but a use for it didn't immediately come to mind. The
two pairs of players scenario is interesting.

> Having the two controllers paired doesn't solve 3.  There are a lot of UI
> problems with having two pointers running around the screen that share a
> focus.  Let's say they're one of those crazy users that like sloppy-
> focus.  What happens when the two cursors are on different windows?  Does
> the primary cursor override?  Does the secondary cursor work in windows
> that aren't focused?  Even if you have click-to-focus, you still have
> problems with the two cursors fighting.  What happens if they both try to
> do a drag-and-drop?  The only solution to this is probably to make the
> secondary pointer inert in windows that aren't focussed.  That said, I'm
> not sure if that 100% fixes it either.

I was thinking that a desktop environment would probably only have one
cursor. In other words, if any wl_seats are sharing focus, most apps would
treat them as aggregate, and it would only be more specialized apps (like
games) that might want to separate them.

> To tie up my comments, I think this can get far too complicated fast.  I
> think it's better to have a seat correspond to a focus and allow for the
> fact that we may have more than one person to a seat.  It's not 100%
> ideal, but it's a lot simpler than some strange tree of seats.  Besides,
> if some compositor wants to do a strange tree of seats, there's nothing
> in pq's proposed protocol that would prevent that and games shouldn't
> care.

pq's proposed protocol is certainly "good enough". My two concerns are:

1) Does this make it a pain to give players more than one IO device, like
gamepad+touchscreen or gamepad+headset?
2) If multiple wl_seats are only used in exotic setups, then programmers are
unlikely to discover that they're not enumerating them properly.

Both of these concerns are somewhat exotic, and I'm personally unlikely to
run into these problems, so I'm not going to lose any sleep if they're not
addressed. It just seems like wl_seats would be a perfect abstraction for
multiple players if only they didn't have to be so tightly tied to focus.

-Rick-

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Gamepad focus model (Re: Input and games.)

2013-05-07 Thread Pekka Paalanen
Hi Todd,

Daniel nicely replied to the most important comments, here are a few
more.

On Mon, 6 May 2013 09:48:47 -0400
Todd Showalter  wrote:

> On Mon, May 6, 2013 at 8:36 AM, Pekka Paalanen  wrote:
> 
> > Into wl_seat, we should add a capability bit for gamepad. When the bit
> > is set, a client can send wl_seat::get_gamepad_manager request, which
> > creates a new wl_gamepad_manager object. (Do we actually need a
> > capability bit?)
> 
> There are options here:
> 
> - have the capability bit; if the bit is set the client can request a
> manager -- this has to deal with the case where the client sent the
> request but the caps bit wasn't set, presumably by returning the
> protocol equivalent of NULL or -1
> 
> - leave out the caps bit, client requests the manager if they want it,
> they get NULL equivalent if there are no gamepads
> 
> - leave out the caps bit, gamepad manager is always there, but can be
> expected to return 0 if asked to enumerate gamepads when none are
> connected

Yeah, like Daniel said, there is no concept of a "return value".

When a client creates a new object, the server can only either agree,
or disconnect the client with a protocol error. Any other behaviour
requires specialized handling, and causes a roundtrip, where the client
must either wait for a reply before continuing, or risk having further
requests ignored without any obvious way to know what got ignored in
the end. Both cases are unacceptable.

When a client sends a request, that creates a new protocol object, then
from the client's point of view, the object is created on that instant,
before the request has even been submitted to the wire. This allows the
client to immediately send more requests on that new object, without
waiting for a roundtrip in between. The same works also in the reverse
direction, when the server creates protocol objects by sending events.

A major design principle in Wayland is to minimize roundtrips, as it
leads to better performance and lower overhead.
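
To illustrate with interfaces that do exist today: the registry proxy
returned by wl_display_get_registry() is usable the instant it is
created, so the client can attach a listener and carry on without
waiting for any reply from the server:

    #include <stdint.h>
    #include <wayland-client.h>

    static void registry_global(void *data, struct wl_registry *registry,
                                uint32_t name, const char *interface,
                                uint32_t version)
    {
        /* globals (wl_compositor, wl_seat, ...) are announced here */
    }

    static void registry_global_remove(void *data,
                                       struct wl_registry *registry,
                                       uint32_t name)
    {
    }

    static const struct wl_registry_listener registry_listener = {
        registry_global,
        registry_global_remove,
    };

    int main(void)
    {
        struct wl_display *display = wl_display_connect(NULL);
        if (!display)
            return 1;

        /* The proxy exists client-side immediately... */
        struct wl_registry *registry = wl_display_get_registry(display);
        /* ...so this needs no waiting for the server. */
        wl_registry_add_listener(registry, &registry_listener, NULL);

        /* One explicit roundtrip, only to collect the initial burst
         * of global announcements. */
        wl_display_roundtrip(display);

        wl_display_disconnect(display);
        return 0;
    }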

> > A wl_gamepad_manager will send an event for each physical gamepad (as
> > it dynamically appears, if hotplugged later) associated with this
> > particular wl_seat, creating a wl_gamepad object for each.
> >
> > A wl_gamepad object will send an event about the player id as the first
> > thing, and also if it later changes.
> 
> Some gamepads don't have player id controls, so we can't rely on
> them, but supporting them where we can is useful.  I think it's best
> viewed as a really forceful hint as to the player's ID, where
> otherwise we're stuck doing heuristics with plugging.

It's not about the gamepad capabilities at all. It's just an
assignment, configured in the server: this input device belongs to
player N.

> > If a gamepad is hot-unplugged, a wl_gamepad event will notify about
> > that, and the wl_gamepad object becomes inert (does not send any
> > events, ignores all but the destroy request).
> 
> Dealing gracefully with things like wireless gamepads running
> their batteries flat or moving out of radio range is important, which
> is what I assume this is to deal with.  I presume the idea here is
> that if the player moves back into range or replaces the batteries,
> the wl_gamepad object revives?

No, that's not what I had in mind. An inert protocol object is
permanently dead by definition, and is only waiting for destruction by
the client. This is just a convenience to avoid races between the
server and the client.

If the gamepad later comes back "online", it is like it was hotplugged
again: a new wl_gamepad object is sent, with the same player id as
before.

> > Gamepad input events are delivered according to the keyboard focus of
> > the related wl_seat. If there is no keyboard to focus, then use the
> > pointer focus, or something. It doesn't really affect the protocol
> > design how the focus is assigned. However, would we need a
> > wl_gamepad::enter,leave events? Probably, along with events for initial
> > state. Or maybe enter/leave should be wl_gamepad_manager events?
> 
> I think we need enter/leave events.  The client can be responsible
> for cleaning up its own state, though if an initial state is sent on
> focus gain that makes things much easier.

Yeah, the main point of the leave event is to say "you don't get any
more input events from this device, until it comes back", and it also
implies that the client should forget all temporary state of the
gamepad, like which buttons were down.

Immediately following an enter event, or in the enter event, a new set
of current state is sent. Notice, that this should not be done by
sending e.g. fake button-down events. We have a protocol design policy,
that input events from user actions are never manufactured.
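
As a rough sketch of what I mean -- hypothetical events, not a real
interface -- the enter event itself carries the state, and the client
simply adopts it:

    #include <stdint.h>

    struct wl_gamepad;  /* hypothetical */
    struct wl_surface;

    struct client_state {
        uint32_t buttons;  /* bitmask of currently-held buttons */
    };

    static void gamepad_enter(void *data, struct wl_gamepad *gamepad,
                              struct wl_surface *surface,
                              uint32_t buttons_down)
    {
        struct client_state *state = data;
        /* Adopt the initial state wholesale; the server never sends
         * fake button-down events to reconstruct it. */
        state->buttons = buttons_down;
    }

    static void gamepad_leave(void *data, struct wl_gamepad *gamepad,
                              struct wl_surface *surface)
    {
        struct client_state *state = data;
        /* Focus is gone: forget all transient device state. */
        state->buttons = 0;
    }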

> I don't see anything here that raises any flags for me; at least
> at first reading it seems quite usable.

Cool. There are lots of details to get right, but those are easier to
tune with some XML at hand.


Thanks,
pq
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel

Re: Gamepad focus model (Re: Input and games.)

2013-05-07 Thread Todd Showalter
On Tue, May 7, 2013 at 3:23 AM, Pekka Paalanen  wrote:

> Yeah, like Daniel said, there is no concept of a "return value".
>
> When a client creates a new object, the server can only either agree,
> or disconnect the client with a protocol error. Any other behaviour
> requires specialized handling, and causes a roundtrip, where the client
> must either wait for a reply before continuing, or risk having further
> requests ignored without any obvious way to know what got ignored in
> the end. Both cases are unacceptable.

Ok.  I was assuming that cases where you had fundamental
capability change in the server (ie: input devices appearing or
disappearing) were rare and special enough to warrant a round trip.

> When a client sends a request, that creates a new protocol object, then
> from the client's point of view, the object is created on that instant,
> before the request has even been submitted to the wire. This allows the
> client to immediately send more requests on that new object, without
> waiting for a roundtrip in between. The same works also in the reverse
> direction, when the server creates protocol objects by sending events.
>
> A major design principle in Wayland is to minimize roundtrips, as it
> leads to better performance and lower overhead.

Fair enough.  We're talking about rare events here, so I wouldn't
have called it essential, but if that's an organizing principle of the
project then so be it.

> It's not about the gamepad capabilities at all. It's just an
> assignment, configured in the server: this input device belongs to
> player N.

The place where that becomes a problem is with controller
batteries.  As an example, I've got a PS3, and my wife uses it to
watch netflix (it's a streaming tv/movie service, for those who
haven't heard of it).  It uses the PS3 controller as a remote, to do
things like play/pause.

It's not uncommon for the battery in the controller to run flat
while she's watching.  I've got a second controller, and we typically
charge one while the other is in use, but fairly often the controller
she's using runs flat.  When that happens, we have a second charged
controller, but to use it we have to reboot the PS3, because without
rebooting it connects as Player 2, and netflix only listens to Player
1.  As far as I know there's no simple way to tell the gamepad to
reconnect as Player 1, short of rebooting the machine and rerunning
all the controller handshaking.

When a gamepad goes away and then it reappears or another appears,
it's *probably* the same player.  So what I'm thinking is that it
makes more sense to have the wl_gamepad go into a "disconnected"
state, and then reactivate when the next gamepad appears, rather than
creating a new wl_gamepad.

> If the gamepad later comes back "online", it is like it was hotplugged
> again: a new wl_gamepad object is sent, with the same player id as
> before.

This would work too.  The main thing is dealing well with the
single player case where the player is replacing a gamepad.  This
could be because:

- they wandered out of RF range when they were getting a drink
- they want to play the game with a different gamepad
- the gamepad they were using ran out of power and is now plugged in via usb
- the gamepad they were using ran out of power and is being replaced
with a charged gamepad
- someone tripped over the usb cord and yanked it out and then plugged
it back in

> Yeah, the main point of the leave event is to say "you don't get any
> more input events from this device, until it comes back", and it also
> implies that the client should forget all temporary state of the
> gamepad, like which buttons were down.

Yes.

> Immediately following an enter event, or in the enter event, a new set
> of current state is sent. Notice, that this should not be done by
> sending e.g. fake button-down events. We have a protocol design policy,
> that input events from user actions are never manufactured.

My temptation would actually be to say that when focus goes to a
new application, we treat buttons that are down as if they were up;
don't send a release when they are lifted.  So, if I'm holding down
SELECT when focus enters the client window and then release it, press
it and release it, the client sees the press and the second release,
but not the initial release.

That doesn't work with axis values, but if the client cares about
deltas it's going to have to clear them on focus change anyways, since
it has already been said that the protocol will not be sending deltas.
 If we were sending deltas we could make things a little cleaner in
some ways, but it does expand the protocol and I'm not sure it does so
usefully.
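
Client-side, the policy I'm describing could look something like this
(all the names here are made up): swallow the first release of any
button held across the focus change, and rebase deltas from the
absolute axis values on enter:

    #include <stdbool.h>
    #include <stdint.h>

    #define NUM_AXES 8

    struct pad_state {
        uint32_t held_at_enter;     /* buttons down when focus arrived */
        float last_axis[NUM_AXES];  /* last absolute values seen */
    };

    static void on_enter(struct pad_state *s, uint32_t buttons_down,
                         const float axes[NUM_AXES])
    {
        s->held_at_enter = buttons_down;
        for (int i = 0; i < NUM_AXES; i++)
            s->last_axis[i] = axes[i];  /* deltas restart from here */
    }

    /* Returns true if the game should see this button event. */
    static bool on_button(struct pad_state *s, int button, bool pressed)
    {
        uint32_t bit = 1u << button;
        if (!pressed && (s->held_at_enter & bit)) {
            s->held_at_enter &= ~bit;
            return false;  /* swallow: the game never saw the press */
        }
        s->held_at_enter &= ~bit;  /* a fresh press clears the mask */
        return true;
    }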

   Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: Gamepad focus model (Re: Input and games.)

2013-05-07 Thread Pekka Paalanen
On Tue, 7 May 2013 11:14:08 -0400
Todd Showalter  wrote:

> On Tue, May 7, 2013 at 3:23 AM, Pekka Paalanen  wrote:
> 
> > Yeah, like Daniel said, there is no concept of a "return value".
> >
> > When a client creates a new object, the server can only either agree,
> > or disconnect the client with a protocol error. Any other behaviour
> > requires specialized handling, and causes a roundtrip, where the client
> > must either wait for a reply before continuing, or risk having further
> > requests ignored without any obvious way to know what got ignored in
> > the end. Both cases are unacceptable.
> 
> Ok.  I was assuming that cases where you had fundamental
> capability change in the server (ie: input devices appearing or
> disappearing) were rare and special enough to warrant a round trip.
> 
> > When a client sends a request, that creates a new protocol object, then
> > from the client's point of view, the object is created on that instant,
> > before the request has even been submitted to the wire. This allows the
> > client to immediately send more requests on that new object, without
> > waiting for a roundtrip in between. The same works also in the reverse
> > direction, when the server creates protocol objects by sending events.
> >
> > A major design principle in Wayland is to minimize roundtrips, as it
> > leads to better performance and lower overhead.
> 
> Fair enough.  We're talking about rare events here, so I wouldn't
> have called it essential, but if that's an organizing principle of the
> project then so be it.

They can be rare, but they can also happen any time.

> > It's not about the gamepad capabilities at all. It's just an
> > assignment, configured in the server: this input device belongs to
> > player N.
> 
> The place where that becomes a problem is with controller
> batteries.  As an example, I've got a PS3, and my wife uses it to
> watch netflix (it's a streaming tv/movie service, for those who
> haven't heard of it).  It uses the PS3 controller as a remote, to do
> things like play/pause.
> 
> It's not uncommon for the battery in the controller to run flat
> while she's watching.  I've got a second controller, and we typically
> charge one while the other is in use, but fairly often the controller
> she's using runs flat.  When that happens, we have a second charged
> controller, but to use it we have to reboot the PS3, because without
> rebooting it connects as Player 2, and netflix only listens to Player
> 1.  As far as I know there's no simple way to tell the gamepad to
> reconnect as Player 1, short of rebooting the machine and rerunning
> all the controller handshaking.
> 
> When a gamepad goes away and then it reappears or another appears,
> it's *probably* the same player.  So what I'm thinking is that it
> makes more sense to have the wl_gamepad go into a "disconnected"
> state, and then reactivate when the next gamepad appears, rather than
> creating a new wl_gamepad.
> 
> > If the gamepad later comes back "online", it is like it was hotplugged
> > again: a new wl_gamepad object is sent, with the same player id as
> > before.
> 
> This would work too.  The main thing is dealing well with the
> single player case where the player is replacing a gamepad.  This
> could be because:
> 
> - they wandered out of RF range when they were getting a drink
> - they want to play the game with a different gamepad
> - the gamepad they were using ran out of power and is now plugged in via usb
> - the gamepad they were using ran out of power and is being replaced
> with a charged gamepad
> - someone tripped over the usb cord and yanked it out and then plugged
> it back in

Yeah, sure, and that's all just heuristics inside the server. The
server needs to make sure the player id becomes what the user
wants, even if one wl_gamepad object is deleted and another created.

The server will have the same problem even if it was supposed to
revive a dead wl_gamepad object, anyway.

The problem you described with PS3 should be solvable with the
mysterious gamepad configuration GUI I talked about before, somehow.

> > Yeah, the main point of the leave event is to say "you don't get any
> > more input events from this device, until it comes back", and it also
> > implies that the client should forget all temporary state of the
> > gamepad, like which buttons were down.
> 
> Yes.
> 
> > Immediately following an enter event, or in the enter event, a new set
> > of current state is sent. Notice, that this should not be done by
> > sending e.g. fake button-down events. We have a protocol design policy,
> > that input events from user actions are never manufactured.
> 
> My temptation would actually be to say that when focus goes to a
> new application, we treat buttons that are down as if they were up;
> don't send a release when they are lifted.  So, if I'm holding down
> SELECT when focus enters the client window and then release it, press
> it and release it, the client sees the press and the second release,
> but not the initial release.

Re: Gamepad focus model (Re: Input and games.)

2013-05-07 Thread Todd Showalter
On Tue, May 7, 2013 at 1:02 PM, Pekka Paalanen  wrote:

>> This would work too.  The main thing is dealing well with the
>> single player case where the player is replacing a gamepad.  This
>> could be because:
>>
>> - they wandered out of RF range when they were getting a drink
>> - they want to play the game with a different gamepad
>> - the gamepad they were using ran out of power and is now plugged in via usb
>> - the gamepad they were using ran out of power and is being replaced
>> with a charged gamepad
>> - someone tripped over the usb cord and yanked it out and then plugged
>> it back in
>
> Yeah, sure, and that's all just heuristics inside the server. The
> server needs to make sure the player id becomes what the user
> wants, even if one wl_gamepad object is deleted and another created.

The client needs to look at a new wl_gamepad when it shows up and
decide whether it's a new player or an existing player who is
reconnecting.  As long as it's easy for the client to do that, I think
we're good.
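
Something like this is what I'd expect clients to end up doing -- the
bookkeeping here is entirely made up, the protocol just has to expose
the player id:

    #include <stdbool.h>

    #define MAX_PLAYERS 4

    struct roster {
        bool in_use[MAX_PLAYERS];
        bool recently_left[MAX_PLAYERS];  /* unplugged moments ago */
    };

    /* Map a newly announced wl_gamepad to a player slot. */
    static int assign_player(struct roster *r, int player_id_hint)
    {
        /* A server-supplied player id wins outright. */
        if (player_id_hint >= 0 && player_id_hint < MAX_PLAYERS &&
            !r->in_use[player_id_hint])
            return player_id_hint;

        /* Otherwise assume a recent disconnect is a reconnect. */
        for (int i = 0; i < MAX_PLAYERS; i++)
            if (!r->in_use[i] && r->recently_left[i])
                return i;

        /* Failing that, it's a genuinely new player. */
        for (int i = 0; i < MAX_PLAYERS; i++)
            if (!r->in_use[i])
                return i;

        return -1;  /* no free slots */
    }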

> The problem you described with PS3 should be solvable with the
> mysterious gamepad configuration GUI I talked about before, somehow.

Partly, though I think the default case should be that if a
controller disappears and another (or the same one) appears, the
assumption is it's the player that just left coming back.  The number
of times that isn't true isn't likely to be statistically significant.

>> My temptation would actually be to say that when focus goes to a
>> new application, we treat buttons that are down as if they were up;
>> don't send a release when they are lifted.  So, if I'm holding down
>> SELECT when focus enters the client window and then release it, press
>> it and release it, the client sees the press and the second release,
>> but not the initial release.
>
> It depends. If a gamepad enters with button A down, and then the
> user presses button B down, is the application supposed to respond
> to B or A+B?

In my experience games that use gamepads don't usually use the
gamepad buttons as modifiers; it can happen, but it's awkward to
explain to the player and often awkward to actually perform with the
hands.  What you get more often is some sort of lock-on, where holding
a button down makes player motion relative to a target (so you can
circle-strafe around an opponent, for example).  In cases like this
the focus switch is likely to have broken the player's context
anyways.

Todd.

--
 Todd Showalter, President,
 Electron Jump Games, Inc.
___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel

