Re: [Accessibility] Need to be able to register for key events globally

2013-12-17 Thread Piñeiro
Hi everybody,

thanks for the answers, and sorry for the delay in my own reply. I will
answer in this thread to cover both [Accessibility] threads, as the
current status of both is really similar.

On 12/12/2013 03:41 AM, Matthias Clasen wrote:

 As Bill says, input methods already have a private protocol for
 intercepting and processing input events on the server side, and a
 similar facility could be added to the private protocol for ATs

Over the past few days I have been doing some research on this,
specifically on how third-party on-screen keyboards (ones not included
in the compositor) could solve the same problem that accessibility
technologies have. As far as I can see, Maliit is one of the
third-party OSKs that has done the most work on Wayland support.
Reading their wiki [1], their proposals include defining new Wayland
extensions, and what they have defined so far [2] is really similar to
what accessibility needs.

 . And again, having at-spi using that private protocol and then
 offering key snooping to everybody over dbus would negate an advantage
 of Wayland, so the user of the private protocol should be the actual
 AT, not some multiplexing intermediate layer.

Well, if we can consider an on-screen keyboard or a screen reader a
trusted client, but we cannot consider the daemon providing
accessibility services a trusted client, then we have a problem,
probably in at-spi2 itself. If the concern about exposing those Wayland
private protocols is security, then we hit again the problem that
at-spi2 gives access to its features to everyone. For example, you can
use at-spi2 to click any button in any application without any
restriction (several automated testing frameworks are based on this).
This is probably the time to solve that in general. After all, D-Bus
has support for security policies. Hopefully a mechanism could be added
so that ATs would need to authenticate in order to use at-spi2
services. Under that assumption, I hope that at-spi2 could be
considered a trusted client.

In any case, my conclusion from these two emails is that in order to
get those features we would need to define some Wayland extensions, no
matter whether it is Orca or at-spi2 that ends up using them.

So with all that in mind, what about a plan like this:
  a) Start to write accessibility-specific wayland extensions
  a.1) Start to implement them in a compositor (Weston first?)
  b) Meanwhile investigate if it is possible to add security policies on
at-spi2, so this can be the trusted client using those extensions.
  b.1) If that fails, Orca would be the trusted client using those
extensions
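As a strawman for (a), such an extension could look something like the
following Wayland protocol XML. This is only a hypothetical sketch,
loosely modeled on the input-method protocol in [2]; every interface,
request, and event name here is invented for illustration:

```xml
<protocol name="a11y_key_events">
  <!-- Hypothetical sketch; all names are invented for illustration. -->
  <interface name="a11y_key_snooper" version="1">
    <request name="register">
      <description summary="ask to see all key events before the focused client"/>
    </request>
    <event name="key">
      <description summary="a key event delivered to the AT first"/>
      <arg name="serial" type="uint"/>
      <arg name="time" type="uint"/>
      <arg name="key" type="uint"/>
      <arg name="state" type="uint" summary="0 released, 1 pressed"/>
    </event>
    <request name="release_key">
      <description summary="tell the compositor to forward this key to the focused client (i.e. the AT did not consume it)"/>
      <arg name="serial" type="uint"/>
    </request>
  </interface>
</protocol>
```

The key design point is that the compositor, not a D-Bus intermediary,
holds the event until the trusted client decides whether to consume it.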

Thoughts?

[1] https://wiki.maliit.org/Wayland_Input_Method_System_Proposal
[2]
https://gitorious.org/maliit/maliit-framework/source/f562ad91ee14680aaafc6401428ef259a9e61598:weston-protocols/input-method.xml

-- 

Alejandro Piñeiro

___
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/wayland-devel


Re: [Accessibility] Need to be able to register for key events globally

2013-12-11 Thread Piñeiro
On 12/11/2013 07:14 PM, Giulio Camuffo wrote:
 Not currently.
 I'm not sure what you mean with consumed. Do you mean that a not
 consumed event is still received by the client that has the focus
 while a consumed one is not?

Yes, that's what I mean. In the Orca example (GNOME's screen reader),
if you press NumPad8 in, say, a text editor, Orca speaks the current
line, but the editor doesn't receive that key (so the text being edited
doesn't change).
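The consumed/not-consumed split can be illustrated with a toy
dispatcher. This is plain Python with no Wayland or AT-SPI involved;
all names are invented for illustration:

```python
# Toy model of the desired semantics: the AT (e.g. Orca) sees every key
# first and decides whether to consume it; only unconsumed keys reach
# the focused client (e.g. the text editor).

ORCA_COMMANDS = {"NumPad8", "NumPad5", "NumPad2"}  # keys Orca consumes

def dispatch(key, at_handler, focused_client):
    """Send the key to the AT first; forward to the client only if not consumed."""
    consumed = at_handler(key)
    if not consumed:
        focused_client(key)
    return consumed

spoken, typed = [], []

def orca(key):
    if key in ORCA_COMMANDS:
        spoken.append(key)   # e.g. NumPad8 -> speak the current line
        return True          # consumed: the editor must not see it
    if key == "Right":
        spoken.append(key)   # presentation only: speak character moved to
    return False             # not consumed: the editor still gets it

def editor(key):
    typed.append(key)

dispatch("NumPad8", orca, editor)  # spoken, not typed
dispatch("Right", orca, editor)    # spoken *and* delivered to the editor
dispatch("a", orca, editor)        # only typed
```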

 If so, i guess a protocol could be developed where the compositor
 sends an event with the key that is pressed and then the client can
 reply with a request to send it to the focused client too.

 Giulio

 2013/12/10 Piñeiro apinhe...@igalia.com:
 GNOME Assistive Technologies need to be able to listen to key events
 globally and have the possibility of consuming them. Example use
 cases:

 * Orca's presentation of navigation (Events not consumed)
   - Right Arrow: Speak character moved to (right of cursor)
   - Shift Right Arrow: Speak character selected (left of cursor)
   - Down Arrow: Speak next line
   - Etc.
 * Orca's navigational commands (Events consumed)
   - H/Shift + H: Move amongst headings
   - NumPad 8: Speak the current line
   - NumPad 5: Speak the current word
   - NumPad 2: Speak the current character
   - Etc.

 Current solution: The Orca screen reader calls AT-SPI2's
 atspi_register_keystroke_listener(). AT-SPI2 then notifies Orca of key
 events it receives from the toolkit implementation of ATK method
 atk_add_key_event_listener(). Applications then have to wait for Orca
 to consume the event or not. This requires two DBUS messages. Toolkit
 authors want to abolish this. That's fine, *if* we have an
 alternative. Do we?


 --
 Alejandro Piñeiro Iglesias (apinhe...@igalia.com)

-- 

Alejandro Piñeiro



Re: [Accessibility] Need to be able to synthesize mouse events

2013-12-11 Thread Piñeiro

On 12/11/2013 07:09 PM, Giulio Camuffo wrote:
 Wayland doesn't have a way to inject mouse events currently. Some
 protocol must be written, which would be presumably implemented as a
 private protocol which only trusted clients can use, given the
 security implications.

I was wondering about that. While reading about Wayland, I found this
extension:
https://github.com/01org/wayland-fits

And as you can read on the description:

The first protocol is an interface for generating input events
on the server-side (e.g. mouse, keyboard, touch).


What I was not sure about is whether that would be the way to go for
accessibility needs, and also, if it is implemented as a private
protocol, how at-spi2 could use it. Or in other words, how to declare
at-spi2 as a trusted client.


 Giulio

 2013/12/10 Piñeiro apinhe...@igalia.com:
 GNOME Assistive Technologies need to be able to synthesize mouse events. Use
 cases:
 * Perform a mouse button click for users who cannot use a physical mouse
 * Route the mouse pointer to an object or element to trigger its hover
 action

 The Orca screen reader currently provides these commands via calls to
 AT-SPI2's atspi_generate_mouse_event() which in turn calls
 spi_dec_x11_generate_mouse_event() [1]. The equivalent functionality is
 needed for Wayland environments.

 How are we going to accomplish this?

 [1]
 https://git.gnome.org/browse/at-spi2-core/tree/registryd/deviceeventcontroller-x11.c#n1404

 --
 Alejandro Piñeiro Iglesias (apinhe...@igalia.com)

-- 

Alejandro Piñeiro



[Accessibility] Need to be able to synthesize mouse events

2013-12-10 Thread Piñeiro
GNOME Assistive Technologies need to be able to synthesize mouse events. 
Use cases:

* Perform a mouse button click for users who cannot use a physical mouse
* Route the mouse pointer to an object or element to trigger its hover 
action


The Orca screen reader currently provides these commands via calls to 
AT-SPI2's atspi_generate_mouse_event() which in turn calls 
spi_dec_x11_generate_mouse_event() [1]. The equivalent functionality is 
needed for Wayland environments.


How are we going to accomplish this?

[1] 
https://git.gnome.org/browse/at-spi2-core/tree/registryd/deviceeventcontroller-x11.c#n1404
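One way to accomplish it would be a privileged Wayland extension for
input synthesis, along the lines of what wayland-fits does for testing.
A hypothetical sketch in Wayland protocol XML follows; every name is
invented for illustration and nothing here reflects an existing
protocol:

```xml
<protocol name="a11y_input_synthesis">
  <!-- Hypothetical sketch; all names are invented for illustration. -->
  <interface name="a11y_pointer_synthesizer" version="1">
    <request name="move_to">
      <description summary="warp the pointer to absolute coordinates (e.g. to trigger a hover action)"/>
      <arg name="x" type="fixed"/>
      <arg name="y" type="fixed"/>
    </request>
    <request name="button">
      <description summary="press or release a pointer button on behalf of the user"/>
      <arg name="button" type="uint"/>
      <arg name="state" type="uint" summary="0 released, 1 pressed"/>
    </request>
  </interface>
</protocol>
```

The compositor would advertise this interface only to trusted clients,
which is exactly the open question above: how at-spi2 (or the AT
itself) gets to be one.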


--
Alejandro Piñeiro Iglesias (apinhe...@igalia.com)


[Accessibility] Need to be able to register for key events globally

2013-12-10 Thread Piñeiro

GNOME Assistive Technologies need to be able to listen to key events
globally and have the possibility of consuming them. Example use
cases:

* Orca's presentation of navigation (Events not consumed)
  - Right Arrow: Speak character moved to (right of cursor)
  - Shift Right Arrow: Speak character selected (left of cursor)
  - Down Arrow: Speak next line
  - Etc.
* Orca's navigational commands (Events consumed)
  - H/Shift + H: Move amongst headings
  - NumPad 8: Speak the current line
  - NumPad 5: Speak the current word
  - NumPad 2: Speak the current character
  - Etc.

Current solution: The Orca screen reader calls AT-SPI2's
atspi_register_keystroke_listener(). AT-SPI2 then notifies Orca of key
events it receives from the toolkit implementation of ATK method
atk_add_key_event_listener(). Applications then have to wait for Orca
to consume the event or not. This requires two DBUS messages. Toolkit
authors want to abolish this. That's fine, *if* we have an
alternative. Do we?
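The round trip that toolkit authors want to abolish can be sketched as
a plain-Python simulation. This is not real AT-SPI2 or D-Bus code;
every class and method name here is invented to model the flow:

```python
# Simulation of the current synchronous flow: for each key event the
# toolkit (via ATK) sends the event to the at-spi2 registry and then
# blocks until the AT's verdict comes back -- two D-Bus messages per key.

class Registry:
    """Stands in for the at-spi2 registry daemon."""
    def __init__(self):
        self.listeners = []
        self.messages = 0
    def register_keystroke_listener(self, listener):
        # Models what Orca does via atspi_register_keystroke_listener().
        self.listeners.append(listener)
    def notify_key_event(self, key):
        self.messages += 1            # message 1: toolkit -> registry
        consumed = any(listener(key) for listener in self.listeners)
        self.messages += 1            # message 2: verdict -> toolkit
        return consumed

registry = Registry()
registry.register_keystroke_listener(lambda key: key == "NumPad8")

delivered = []
for key in ["NumPad8", "a"]:
    if not registry.notify_key_event(key):  # the application blocks here
        delivered.append(key)               # only unconsumed keys reach it
```

The per-keystroke blocking round trip is the cost a compositor-side
protocol would remove.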


--
Alejandro Piñeiro Iglesias (apinhe...@igalia.com)


Re: Making Wayland accessible

2013-10-17 Thread Piñeiro
Hi Matthias,

thanks for starting the discussion. I will add some points below.

On 10/15/2013 10:05 PM, Matthias Clasen wrote:
 As part of the ongoing GNOME Wayland porting effort, the GNOME
 accessibility (a11y) team has been looking into what it would take to
 make our existing a11y tools (ATs) and infrastructure work under Wayland.

FWIW, most of the stuff we did at the recent Montreal Summit was
related to Wayland. You can find a summary on this wiki page:
https://wiki.gnome.org/Accessibility/Hackfests/Montreal2013


 Most a11y features will probably end up being implemented in the
 compositor:
  
 - input tweaks like slow keys or bounce keys or hover-to-click
 naturally fit in the event dispatching in the compositor

 - display changes like zoom or color adjustments are already handled
 in gnome-shell under X

 - the text protocol should be sufficient to make on-screen keyboards
 and similar tools  work

 The remaining AT of concern is orca, our screen reader, which does not
 easily fit into the compositor. Here are some examples of the kinds of
 things orca needs to do to support its users:

 - Identify the surface that is currently under the pointer (and the
 corresponding application that is registered as an accessible client)

FWIW (again), there is an ongoing conversation about that here:
https://mail.gnome.org/archives/gnome-accessibility-devel/2013-October/msg6.html


 - Warp the pointer to a certain position (see
 https://bugzilla.gnome.org/show_bug.cgi?id=70 for a description of
 how this is used)

There is also the need to track the mouse position (more about that
here: https://bugzilla.gnome.org/show_bug.cgi?id=710012, although it
doesn't include a description of how it is used).


 - Filter out certain key events and use them for navigation purposes.
 This is currently done by capturing key events client-side and sending
 them out again via at-spi, which will probably continue to work, even
 if it is awkward and toolkit authors would really like to get rid of it

 All of these features violate the careful separation between clients
 that Wayland maintains, so that probably calls for some privileged
 interface for ATs.

Most of the people I asked (mostly after Wayland-related presentations
at GUADEC and elsewhere) pointed in that direction.

 I would appreciate feedback and discussion on this.

 Has anybody else thought about these problems already ?

 Are there better ways to do these things under Wayland ?


BR

-- 

Alejandro Piñeiro
