GNOME assistive technologies need to be able to listen to key events
globally and to have the option of consuming them. Example use cases:

* Orca's presentation of navigation (Events not consumed)
  - Right Arrow: Speak character moved to (right of cursor)
  - Shift Right Arrow: Speak character selected (left of cursor)
  - Down Arrow: Speak next line
  - Etc.
* Orca's navigational commands (Events consumed; see the sketch below)
  - H/Shift + H: Move amongst headings
  - NumPad 8: Speak the current line
  - NumPad 5: Speak the current word
  - NumPad 2: Speak the current character
  - Etc.
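
To make the consumed/not-consumed split concrete, here is a rough
sketch of a key handler in the style Orca uses through the pyatspi
binding. Treat it as illustrative only: the "KP_8" string match and
the speak_current_line() helper are hypothetical stand-ins for Orca's
real key-binding tables, not its actual code.

  import pyatspi

  def speak_current_line():
      # Hypothetical stand-in for Orca's speech output.
      print("speaking current line")

  def on_key(event):
      # Navigational command (NumPad 8, "speak the current line"):
      # returning True marks the event as consumed, so the
      # application never sees it.
      if event.type == pyatspi.KEY_PRESSED_EVENT and \
         event.event_string == "KP_8":
          speak_current_line()
          return True
      # Right Arrow and friends: Orca presents the result of the
      # caret move but returns False, so the key still reaches the
      # application as usual.
      return False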

Current solution: the Orca screen reader calls AT-SPI2's
atspi_register_keystroke_listener(). AT-SPI2 then notifies Orca of
the key events it receives from the toolkit's implementation of the
ATK method atk_add_key_event_listener(). The application then has to
wait for Orca to decide whether or not to consume each event, which
costs two D-Bus messages per key event. Toolkit authors want to
abolish this. That's fine, *if* we have an alternative. Do we?
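
For reference, the registration side via pyatspi looks roughly like
the following. Again a sketch, not Orca's code: the keyword arguments
are from memory of pyatspi's registry API and worth double-checking.
synchronous=True is the part that makes the toolkit block on the
AT's verdict.

  import pyatspi

  def on_key(event):
      # True = consumed, False = deliver to the application
      # (see the handler sketch above).
      return False

  # synchronous=True makes the toolkit wait for this handler's return
  # value on every key event: the two D-Bus messages described above.
  # The default key_set is assumed here to cover all keys.
  pyatspi.Registry.registerKeystrokeListener(
      on_key,
      kind=(pyatspi.KEY_PRESSED_EVENT, pyatspi.KEY_RELEASED_EVENT),
      synchronous=True)

  pyatspi.Registry.start()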


--
Alejandro Piñeiro Iglesias (apinhe...@igalia.com)