Hi, all. Apologies for the length of this, but I'm sort of diving 
head-first into Android development along a pretty unconventional path, 
so I'm encountering lots of unusual issues along the way.

By way of background, I'm totally blind. While doing one of my 
semi-regular Google pokes to see if anything new was happening on the 
Android accessibility front, I discovered Donut's accessibility API. Now 
I'm flirting with the idea of writing an open source Android screen 
reader. Of course, most new developers likely take on a less challenging 
app, and I don't know how far I'll take this, but unless I (or someone 
else) does this, "Hello, world" is pretty useless to me. :)

First things first: are there any similar efforts out there? Would 
anyone be interested in working on something like this? I've seen 
http://slimvoice.googlecode.com, but in addition to there being no 
contact information to be found, the project seems to be going in 
several directions at once, and I'm just not sure how productive that 
will prove.

Anyhow, my development environment pretty much has to be console-based, 
since I've not found accessible Linux/GTK IDEs. I'm also into Scala 
rather than Java. To that end, I've downloaded the Android SDK and have 
set up the Android plugin for SBT, my build tool of choice. I can 
successfully start the emulator, and can shell in with adb shell and 
poke around. SBT seems to communicate with the emulator successfully: I 
can build an APK and install it. This is pretty much where my knowledge 
breaks down, because the graphical side of the emulator is entirely 
inaccessible to me until I can get a working AccessibilityService 
speaking, which, I gather, I can't do without tweaking the emulator. So, 
questions:

1. Does the emulator route TTS (and audio generally) through the host 
machine's native sound hardware, so that I'd actually hear speech from a 
service running inside it?

2. From 
http://developer.android.com/reference/android/accessibilityservice/AccessibilityService.html
I read:

The lifecycle of an accessibility service is managed exclusively by the 
system. Starting or stopping an accessibility service is triggered by an 
explicit user action through enabling or disabling it in the device 
settings.

This creates a chicken-and-egg problem: I can't enable my service 
without an accessible emulator, which is exactly what I hope my service 
will eventually provide. Is it possible to enable an accessibility 
service, or make whatever settings tweaks are needed, from the adb 
shell? My own half-formed guess at a workaround is sketched below, after 
question 3.

3. I think that my initial goal should be an AccessibilityService that 
captures events and dumps the applicable data to stdout on a shell. I 
can then play around in the emulator, see what events are generated and 
what data those events contain, and slowly construct a meaningful text 
stream. That stream can eventually be piped into the TTS service, but 
simple printlns would involve fewer moving parts and would help me get 
started much faster. This, however, requires the ability to start the 
service from the adb shell so that I'd have my service's stdout in my 
terminal. Is this at all possible? I've discovered the am command, but 
it's not particularly googleable, and I can't figure out how it works. 
:) A rough sketch of the kind of service I have in mind follows right 
after this question.
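
To make question 3 more concrete, here's roughly the skeleton I have in 
mind, in Scala and entirely untested. One caveat I've since picked up is 
that an Android process's stdout doesn't normally reach the adb shell, 
so this sketch logs through android.util.Log instead, on the assumption 
that I can watch the output with adb logcat from my terminal. The 
package, class, and tag names are just placeholders:

  package org.example.reader

  import android.accessibilityservice.{AccessibilityService, AccessibilityServiceInfo}
  import android.util.Log
  import android.view.accessibility.AccessibilityEvent

  class ReaderService extends AccessibilityService {

    private val Tag = "ReaderService"

    // Called when the system binds to the service; ask for every event
    // type from every package so I can see what's available.
    override def onServiceConnected() {
      val info = new AccessibilityServiceInfo
      info.eventTypes = AccessibilityEvent.TYPES_ALL_MASK
      info.feedbackType = AccessibilityServiceInfo.FEEDBACK_SPOKEN
      info.notificationTimeout = 50
      setServiceInfo(info)
      Log.d(Tag, "connected")
    }

    // Dump whatever each event carries so I can work out what a
    // meaningful text stream would look like.
    override def onAccessibilityEvent(event: AccessibilityEvent) {
      Log.d(Tag, "type=" + event.getEventType +
                 " package=" + event.getPackageName +
                 " class=" + event.getClassName +
                 " text=" + event.getText)
    }

    // The system calls this when whatever feedback I'm producing should
    // be interrupted.
    override def onInterrupt() {
      Log.d(Tag, "interrupted")
    }
  }

I gather the service also needs a <service> entry in AndroidManifest.xml 
with an intent filter for the android.accessibilityservice.AccessibilityService 
action before the system will treat it as an accessibility service at all.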
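
As for question 2, my current guess (completely unverified, and the 
component name is just a placeholder for whatever my service ends up 
being called) is that, since the secure settings live in a SQLite 
database the emulator's root shell can write to, something along these 
lines might enable the service without ever touching the Settings UI:

  adb shell
  # sqlite3 /data/data/com.android.providers.settings/databases/settings.db
  sqlite> INSERT INTO secure (name, value) VALUES ('accessibility_enabled', '1');
  sqlite> INSERT INTO secure (name, value) VALUES ('enabled_accessibility_services', 'org.example.reader/org.example.reader.ReaderService');
  sqlite> .quit

followed by restarting the emulator, since writing to the database 
directly presumably bypasses whatever change notification the settings 
provider would normally send. If anyone knows whether this actually 
works on Donut, or what the correct keys and value format are, I'd love 
to hear it.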

Thanks a bunch.
