Greetings,

I'm getting started with Android, and the application I'm working on is for 
blind users. I studied the Eyes-Free code, but it is quite complex for what 
I need. The idea is that I have several buttons on the screen, and a blind 
user slides a finger over it to find out which control the finger is passing 
over, using TTS and the accessibility feature that I believe is called 
Explore by Touch. Can anyone help me with a simple example of this?

If Explore by Touch or touch exploration is turned on, then I can slide my 
finger over the screen to find out what is on it. Whenever my 
finger touches a button, block of text, edit field, etc., the screen 
reader lets me know. 
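
In case it is useful context, here is only a sketch (the class name is made 
up) of how an app can ask the system whether touch exploration is currently 
turned on, using AccessibilityManager.isTouchExplorationEnabled(), which is 
available from API level 14:

import android.app.Activity;
import android.view.accessibility.AccessibilityManager;
import android.widget.Toast;

// Hypothetical activity that reports whether Explore by Touch is active.
public class TouchExplorationCheckActivity extends Activity {
    @Override
    protected void onResume() {
        super.onResume();
        AccessibilityManager am =
                (AccessibilityManager) getSystemService(ACCESSIBILITY_SERVICE);
        // Reports whether touch exploration (Explore by Touch) is enabled.
        boolean exploring = am != null && am.isTouchExplorationEnabled();
        Toast.makeText(this,
                exploring ? "Touch exploration is on" : "Touch exploration is off",
                Toast.LENGTH_SHORT).show();
    }
}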

The idea is to write my own code: a simple example with two buttons, 
something like the sketch below.
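
This is only a sketch of what I have in mind; the class name, labels, and 
spoken strings are all placeholders. Two ordinary Buttons get text and a 
content description, so that when touch exploration is on the screen reader 
announces whichever one the finger passes over, and a TextToSpeech instance 
is used to speak something extra when a button is activated:

import java.util.Locale;

import android.app.Activity;
import android.os.Bundle;
import android.speech.tts.TextToSpeech;
import android.view.View;
import android.widget.Button;
import android.widget.LinearLayout;

// Hypothetical two-button example; names and labels are placeholders.
public class TwoButtonsActivity extends Activity
        implements TextToSpeech.OnInitListener {

    private TextToSpeech tts;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        LinearLayout layout = new LinearLayout(this);
        layout.setOrientation(LinearLayout.VERTICAL);

        // A Button with text and a content description is announced by the
        // screen reader as soon as the finger touches it during exploration.
        Button first = new Button(this);
        first.setText("First button");
        first.setContentDescription("First button");

        Button second = new Button(this);
        second.setText("Second button");
        second.setContentDescription("Second button");

        // Optional: speak an extra message when a button is activated.
        tts = new TextToSpeech(this, this);
        first.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                speak("First button pressed");
            }
        });
        second.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                speak("Second button pressed");
            }
        });

        layout.addView(first);
        layout.addView(second);
        setContentView(layout);
    }

    private void speak(String text) {
        if (tts != null) {
            tts.speak(text, TextToSpeech.QUEUE_FLUSH, null);
        }
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.getDefault());
        }
    }

    @Override
    protected void onDestroy() {
        if (tts != null) {
            tts.shutdown();
        }
        super.onDestroy();
    }
}

Is this roughly the right approach, or is there a simpler way to hook into 
Explore by Touch?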

Thank you very much
