Hi Stu,

You see, Markku... there are many blind people who use JAWS and would not have to if an accessible reader were "inside" OOo 2.0.

I hate to keep pushing it, but accessibility for the blind and visually impaired IS an issue. The next upgrade must be accessible.

Stu ([EMAIL PROTECTED])

There is a long-standing debate about self-voicing apps vs. apps that are compatible with assistive technologies. There are two components of this debate:


 1. What do users actually want/prefer?
 2. What is the most efficient and effective thing to do?

Not having a notable disability, I cannot provide a personal opinion on what users prefer. But I can share with you a summary of my anecdotal conversations with many users with a variety of disabilities over the course of the last 13 years that I've been in this field.

The general, overall preference is for applications that are (a) specifically tailored to their needs [e.g. self-voicing, self-Brailling, self-speech-recognizing], that are (b) the same apps that everyone else uses, or at least (c) have all the same features and never lag behind the apps that everyone else uses, and (d) that don't cost anything more than the apps that everyone else uses. It turns out you can't have all of this in general, which leads many/most users to the assistive technology approach - make the apps everyone else uses work well with my AT. A strong and vocal minority of users prefer the parallel, specially-designed-just-for-them approach (e.g. emacspeak).

Now, in theory, with open source, folks in the community could make a given application self-accessible (e.g. self-voicing, self-Brailling, self-voice-recognizing). You could even do this with some handful of "important" apps (for some value of important). But you quickly run into several problems:

 a. What are the "important apps"?
 b. Where do the resources come from to do all this work?
 c. What alternate user interaction modalities do you support (and
    thus what disability communities do you support)?

A self-voicing OOo would be great. It would allow those without disabilities, and those with blindness who can hear, to use it. There is of course the question of where we get a FOSS TTS engine to bundle into OOo. And this doesn't help the deaf-blind. And this doesn't help those with mobility impairments needing single-switch or head-mouse or eye-gaze technology. And this doesn't help those needing voice recognition. And... And if we try to address all of those needs, we have a massive combinatorial explosion - every disability need crossed with every important application.

And if we actually manage to do a significant number of "important apps" for a significant number of disabilities, we run into a further problem: integration of the user experience. Which app controls the Braille display? How do we decide when some in-the-background app should be able to send an important, interrupting speech message? What controls where my voice recognition commands and dictation go? These are all "solvable" problems - as my manager likes to say, you can do anything with enough time, money, and compute power. But it is a very large coordination job to solve it.


And that all leads us to the second question: what is the most efficient and effective thing to do? I believe it is to do essentially what we are already doing:


 a. define an accessibility architecture that provides all of the
    information that any AT might need (and make it extensible so you can
    add the things you missed)
 b. implement that architecture not just in the "important apps", but
    build it into the GUI toolkits and frameworks so that as much as
    possible *all* apps built with those toolkits/frameworks will
    implement the accessibility architecture
 c. build a set of AT that use this architecture, and thereby provide
    access to the entire desktop and do so in a consistent fashion
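To make step (b) concrete, here is a minimal sketch using the Java Accessibility API (javax.accessibility), which follows exactly this model; the class name and widget are just for illustration. Because the Swing toolkit itself implements the accessibility architecture, a plain button already exposes its name and role to any AT, with zero app-specific accessibility code:

```java
import javax.accessibility.AccessibleContext;
import javax.accessibility.AccessibleRole;
import javax.swing.JButton;

public class ToolkitAccessibilityDemo {
    public static void main(String[] args) {
        // The app author writes only this - an ordinary toolkit widget.
        JButton save = new JButton("Save");

        // Any AT (screen reader, magnifier, etc.) can query the same,
        // consistent information through the toolkit-provided context.
        AccessibleContext ctx = save.getAccessibleContext();
        System.out.println("name: " + ctx.getAccessibleName());
        System.out.println("is push button: "
                + (ctx.getAccessibleRole() == AccessibleRole.PUSH_BUTTON));
    }
}
```

The point of the sketch is that the accessibility support came from the toolkit, not from the application - which is why building the architecture into the toolkits covers *all* apps built with them.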

In addition, we're doing virtually all of this work open source (mostly under the LGPL license), so that you'll be able to have a free screen reader/magnifier working with your free office suite, your free web browser, your free e-mail/calendaring application. Of course, to get this combination you need to be running on a free desktop (or on Solaris 10, which is "free as in beer" - a no-cost download for x86/x64 systems).


But as far as making OOo self-voicing for use on Windows instead of needing a recent copy of JAWS (which supports the accessibility architecture) - that isn't something anyone I know of is working on. Though, being open source, you are most welcome to make such a contribution!



Regards,

Peter Korn
Sun Accessibility team


----- Original Message ----- From: "Markku Yli-Pentila" <[EMAIL PROTECTED]>
To: <[email protected]>
Sent: Thursday, April 07, 2005 11:28 PM
Subject: Re: [ui-dev] Accessibility



Hello,
I am a blind user who uses JAWS 3.3.1 and who at the moment has no
possibility to upgrade. I can use OOo 1.0 somehow, but not OOo 1.1 at all.
The menus show up fine, but the text does not show at all. When I asked here
about this, it was explained to me, but at the same time I understood that
the developers have no intention of restoring the earlier state or doing
something so that it would work. I want to use the newest software, but
with my old JAWS and Windows 98 it is sometimes hard.
Friendly,
Markku

--
--------------------------------------------------------------------------


Markku Yli-Pentila
www: http://www.kauhajoki.fi/markkuyp/ 06-2311354 / 0400 819088

---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]












