On Tue, Mar 01, 2016 at 06:20:18PM AEDT, Michael Pozhidaev wrote:
> Hi Luke,
> 
> Luke Yelavich writes:
> 
> > There is much to be done to get this off the ground. Given that
> > Canonical is writing its own display server, Mir, the first
> > requirement is to properly tie the accessibility infrastructure,
> > mainly at-spi, into Mir so that it can intercept input events, and to
> > extend at-spi itself to support touch. It would then be a matter of
> > extending Orca to work with touch without requiring Gtk support. In
> > addition, Qt's own Linux accessibility support would likely need much
> > work, particularly the QML accessibility components. Ubuntu's
> > QML-based SDK would also need much accessibility work.
> 
> Yes, there really is plenty of work to do. Would it be reasonable to
> first try getting some features for basic phone functions working
> (just to have something to begin with), and after that attempt the
> work you have described?

It is possible, but I am not sure how a user would find what they need
on screen without some way of intercepting the touch input, to make sure
they don't activate something they did not mean to activate. Simple
notifications, such as who is calling or incoming messages, could be
done, but the problem as I see it is then getting to the appropriate app
to re-read such data.
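
To illustrate why interception matters, here is a toy Python sketch of
the usual "explore by touch" model: an intercepted touch only announces
the item under the finger, and only a confirming second touch is passed
on as a real activation. None of this is existing Mir or at-spi API; it
is just the interaction model I have in mind.

    # Toy model of "explore by touch": the screen reader sees every touch
    # before the app does, announces the touched item, and only forwards
    # a confirming second touch as a real activation.
    class ExploreByTouch:
        def __init__(self, speak, activate):
            self.speak = speak        # e.g. hand text to a TTS engine
            self.activate = activate  # e.g. deliver the tap to the app
            self.last_item = None

        def on_touch(self, item):
            if item != self.last_item:
                # First touch on a new item: announce it, don't activate.
                self.last_item = item
                self.speak(item)
            else:
                # Same item touched again: treat it as confirmed.
                self.activate(item)

    reader = ExploreByTouch(speak=lambda i: print("speak:", i),
                            activate=lambda i: print("activate:", i))
    reader.on_touch("Answer call")   # announced only
    reader.on_touch("Answer call")   # now activated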

> I know a bit about the AT-SPI internals (Mike Gorse helped me a lot
> with that), and agree that this work is worth doing, but will it let
> us get an accessible phone in the foreseeable future?

That all depends on the number of developers who get involved in making
this happen. Extending at-spi to support touch events will likely need
to be done at some point for Wayland, so that could certainly be started
now by those of us who are keen to get it done. The real challenge is
then tying it in with the Mir display server that Canonical is already
using on the phone, which would likely require help from the Mir
developers for best results. They already have a high-level
understanding of what is needed, but getting this working is not a
priority for them right now.

From what they told me, they would probably want Mir itself to assume
control of input interception, rather than at-spi, since they don't want
external processes to have access to any of Mir's input event management
data.
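
To make the at-spi side a little more concrete, here is a minimal Python
sketch of how an assistive technology registers for events through
at-spi today, using the pyatspi bindings that Orca is built on. The
commented-out "touch:gesture" listener is purely hypothetical; no such
event type exists yet, and adding something along those lines, fed by
Mir's input interception, is the kind of extension work I mean.

    import pyatspi

    # How ATs listen for accessibility events today via AT-SPI.
    def on_focus_change(event):
        # event.source is the accessible object that just gained focus.
        print("Focus moved to:", event.source.name)

    pyatspi.Registry.registerEventListener(on_focus_change,
                                           "object:state-changed:focused")

    # Hypothetical: a touch/gesture event type that an extended at-spi
    # might expose. "touch:gesture" does NOT exist today; it is only an
    # illustration of the hook a touch-capable screen reader would need.
    # pyatspi.Registry.registerEventListener(on_touch, "touch:gesture")

    pyatspi.Registry.start()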

> Is the Ubuntu phone already on Mir, or does it still use ordinary
> X.org? I am really impatient to try something (although I don't have
> this phone yet). I am ready to try getting this phone, but I need to
> be sure that I will be able to have access to its internals.
> Otherwise, apparently, I will be unable to do anything at all.

As above, Mir is used on the phone now.

> I am planning to be in London from April 10th till the 18th (and am
> completely unaware when I could get my next chance to come to London).
> May I ask somebody to meet me and tell me more about this phone,
> please? If I get a first understanding of it, I will purchase it for
> further experiments. I support everything you have written and would
> be happy to participate in this development, but I just want to have
> something to use as soon as possible, simply because I don't see
> anything else suitable around. I don't trust Android, Tizen is
> completely unclear about its accessibility features, but the Ubuntu
> phone is very inspiring, also as something which I can improve myself.

Sorry, I am not sure I am able to help here with advice as to who you
could contact in London to get more information or help with the Ubuntu
phone platform.

Luke

-- 
Ubuntu-accessibility mailing list
Ubuntu-accessibility@lists.ubuntu.com
https://lists.ubuntu.com/mailman/listinfo/ubuntu-accessibility
