I've done some Windows desktop and Windows PDA/phone development, but have 
no experience (yet) with Android other than as a user.

I have a specific interest in something akin to remote desktop: when both 
devices are connected (probably via Wi-Fi), one Android device [slave] 
would show an image of the other device's [host] screen, and (multi)touch 
input on the slave would be directed to the host, essentially allowing the 
slave to control the host remotely, in real time, while seeing the effect 
of that control.

The screen presentation could probably be handled pretty easily, and the 
same goes for the Wi-Fi connection; what I'm really interested in is the 
screen-control options.
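To give a sense of the transport side I have in mind, here's a minimal sketch in plain Java (loopback TCP here; on real devices the host would listen on its Wi-Fi address and the slave would connect — the ephemeral port and the writeUTF framing are my own assumptions, nothing Android-specific):

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

public class LinkSketch {
    // Sends msg from a "slave" socket to a "host" listener over loopback
    // and returns what the host received.
    public static String roundTrip(String msg) throws Exception {
        // Host side: bind an ephemeral port and wait for the slave to connect.
        try (ServerSocket server = new ServerSocket(0)) {
            int port = server.getLocalPort();

            // Slave side: connect over loopback and push one framed message.
            Thread slave = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", port);
                     DataOutputStream out = new DataOutputStream(s.getOutputStream())) {
                    out.writeUTF(msg);
                } catch (IOException e) {
                    throw new RuntimeException(e);
                }
            });
            slave.start();

            try (Socket s = server.accept();
                 DataInputStream in = new DataInputStream(s.getInputStream())) {
                String received = in.readUTF();
                slave.join();
                return received;
            }
        }
    }
}
```

That part seems routine; the open question for me is what goes over the wire.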

Does Android expose, at any level, the events for user input (both to 
capture input on the slave and to push it to the host)? That could be the 
raw multitouch input or some intermediate level of control. The basic 
question is whether the input on one device can be used as direct input on 
a different device, without having to create control wrappers for each 
program, or (as with one existing program I read about) having to change 
the default keyboard to a host program specifically to enable this type of 
functionality.
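To make the question concrete, here's roughly what I imagine forwarding raw touch input might look like at the wire level. The field names follow what Android's MotionEvent exposes (getAction(), pointer id, getX(), getY()), but the byte layout is purely my own invention for illustration — the real question is whether events like these can be injected on the host side at all:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class TouchWire {
    // Pack one touch sample into bytes. Field names mirror MotionEvent's
    // accessors; the layout itself is a made-up example format.
    public static byte[] encode(int action, int pointerId, float x, float y)
            throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeInt(action);
        out.writeInt(pointerId);
        out.writeFloat(x);
        out.writeFloat(y);
        return buf.toByteArray();
    }

    // Unpack into {action, pointerId, x, y} (the two ints widened to float
    // so one array can hold the whole sample).
    public static float[] decode(byte[] bytes) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes));
        return new float[] { in.readInt(), in.readInt(), in.readFloat(), in.readFloat() };
    }
}
```

Capturing something like this on the slave (e.g. from an Activity's onTouchEvent) seems straightforward; it's the host-side injection I'm unsure about.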

I'd welcome technical information (what can and can't be done in Android), 
conceptual solutions ("here is one way it might work"), and suggestions for 
better search terms so I can find existing posts on the subject. Also, if 
this isn't the ideal forum for this question, please feel free to gently 
point me in a better direction.

Thank you,
Keith

-- 
You received this message because you are subscribed to the Google
Groups "Android Developers" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
