Agreed, but I think their main focus was figuring out how to control the
device sans visual feedback rather than how to implement another
text-to-speech engine. So this was a quick hack to get that out of the way.
CB
Jacob Schmude wrote:
Hi
The only problem with having pre-rendered speech on the iPhone is that
the device has way too many capabilities to be covered by any type of
pre-rendered speech. You might be able to pre-render the main menus,
the phone book, and your music library. Then what? Web browsing?
Email? Applications? Text messaging? None of that would work, and
that's what makes the iPhone worth having IMHO. Otherwise I'd just buy
a new WM6 phone at half the price.
On Jan 26, 2009, at 14:47, Chris Blouch wrote:
This is a demo of somebody using an iPhone without any visual
feedback. From their paper, they pre-rendered all the audio on a PC,
a la the iPod Nano, and downloaded the snippets to the iPhone to be
used by their software.
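For a rough idea of the device side, here is a minimal Swift sketch assuming
each label has a matching pre-rendered clip bundled with the app; the
SnippetPlayer type and the .caf file naming are guesses for illustration, not
details from their paper.

import AVFoundation

// Sketch only: assumes each label ("phone", "music", "mail", contact names,
// etc.) maps to a clip rendered ahead of time on a PC and bundled with the
// app, e.g. phone.caf.
final class SnippetPlayer {
    private var player: AVAudioPlayer?

    // Play the pre-rendered clip for a label, if one was generated for it.
    func speak(_ label: String) {
        guard let url = Bundle.main.url(forResource: label, withExtension: "caf") else {
            return // no clip was rendered for this label
        }
        player = try? AVAudioPlayer(contentsOf: url)
        player?.play()
    }
}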
Description:
As the user drags their finger down the surface you can hear 'phone,
music, mail' and then they make a gesture with their finger to open
up the phonebook. The gesture is tapping with a second finger while
continuing to point to the object to take action on. Next they drag
down the surface and you hear it announce entries in the phone book.
Doing the same two-finger gesture calls the name just announced.
They flick straight up to go home. In mail they drag down the list
and then flick left to reply to the message. In music they drag down
to hear the artists and then drag right to hear songs under a
particular artist. They flick right to change tracks. Double-tap to
pause.
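Here is a rough Swift guess at what that explore-plus-second-finger-tap
handling could look like; the three-item menu and the announce/activate stubs
are placeholders, not code from their paper.

import UIKit

// Sketch only: rows are announced as the first finger moves over them, and a
// tap by a second finger (while the first is still down) activates the row
// currently under the first finger.
final class ExploreByTouchView: UIView {

    private let items = ["phone", "music", "mail"]
    private var currentIndex: Int?
    private var primaryTouch: UITouch?

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true // needed so the second finger is delivered
    }
    required init?(coder: NSCoder) { fatalError("not used in this sketch") }

    private func row(under touch: UITouch) -> Int {
        let y = touch.location(in: self).y
        let rowHeight = bounds.height / CGFloat(items.count)
        return max(0, min(items.count - 1, Int(y / rowHeight)))
    }

    private func announce(_ label: String) { print("announce: \(label)") } // e.g. play a snippet
    private func activate(_ label: String) { print("activate: \(label)") } // e.g. open the phone book

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        if primaryTouch == nil, let first = touches.first {
            // First finger down: start exploring and announce what is under it.
            primaryTouch = first
            let i = row(under: first)
            currentIndex = i
            announce(items[i])
        } else if primaryTouch != nil, let i = currentIndex {
            // A second finger tapped while the first is held down: act on the row.
            activate(items[i])
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let primary = primaryTouch, touches.contains(primary) else { return }
        let i = row(under: primary)
        if i != currentIndex {
            currentIndex = i
            announce(items[i]) // announce each new row the finger crosses
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let primary = primaryTouch, touches.contains(primary) {
            primaryTouch = nil
            currentIndex = nil
        }
    }
}

The flicks (up for home, left to reply, right to change tracks) and the
double-tap would need their own recognizers on top of this.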
CB
David Poehlman wrote:
what is this a demo of?
----- Original Message ----- From: "Chris Blouch" <[email protected]>
To: "General discussions on all topics relating to the use of Mac OS
X by the blind" <[email protected]>
Sent: Monday, January 26, 2009 11:26 AM
Subject: Re: iPhone Accessibility
The Youtube demo is here:
http://www.youtube.com/watch?v=496IAx6_xys
The video is not described, but basically the screen is blank and
everything they touch gets announced as their finger drags over the
iPhone's surface. Then a second finger is used to gesture that you want
to do something with the thing just announced. Slick. Wonder when this will
move out of the labs.
CB
Victor Tsaran wrote:
Also, if you search for a university research project called
"Sliderule", you may just find something interesting. There is also
a video demoing this technology on YouTube.
On 1/9/2009 10:01 AM, E.J. Zufelt wrote:
Not sure if you all are familiar with these three articles regarding
iPhone / cellular accessibility. These are courtesy of a post on
another list.
"Here is something that I saw last month about a possible solution
for
the current iPhone.
http://www.engadget.com/2008/12/01/silicon-touch-an-iphone-case-for-the-visually-impaired/
Here is something that's been around for a while but doesn't seem to
have developed much.
http://www.engadget.com/2007/10/25/apple-envisions-tactility-on-touchscreen-keyboards/
http://www.engadget.com/2007/11/06/nokia-shows-off-haptikos-tactile-touch-screen-technology/
"
Thanks,
Everett
The major difference between a thing that might go wrong and a
thing that cannot possibly go wrong is that when a thing that cannot
possibly go wrong goes wrong it usually turns out to be impossible to
get at or repair.
--Douglas Adams