Melinda Green wrote:
> Dzonatas Sol wrote:
>> Attached is the basics of what is needed to get the gesture tracker
>> to listen to a port and react. For now, I've left out the XML server
>> socket bits and just cut it down to minimal code. The XML server
>> socket works; I just don't want to confuse that with the request to
>> simply read a UDP socket and react to a lookatPoint or gesture. (The
>> XML server socket is my criterion.)
>>
>> GestureTracker::init( LLPumpIO* ) is the entry point.
>>
>> The questions I have so far are:
>>
>> * Is this the correct way to set up and use mLookAt and
>> mLookAt->setLookAt()? Also, I'm not sure yet whether it can be
>> initialized when mainLoop() starts or only after login.
>> * I haven't found yet which attention type the head motion should
>> have, so I picked free-look. If the position is updated, is the
>> attention timer reset, or does it still time out?
>
> I suspect we should add a new attention type specifically for
> head-tracking devices. Its default priority can be similar to
> free-look (i.e. low). I'm thinking it should be just below free-look
> because its inputs will be less "intentional" on the user's part.
> (Conscious attention should take precedence over unconscious
> attention.) I don't have the code in front of me, but I expect the
> timer will reset with each effective input. It should be fine that
> these inputs come in almost continuously; assuming the priority is
> low enough, they will be easily overruled by higher-priority
> attention inputs.
>
> -Melinda

Thanks for the good info!
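In case a standalone example helps, here's roughly what the reader side boils
down to once the LLPumpIO plumbing is stripped away: block on a UDP socket and
hand each datagram to the spot where mLookAt->setLookAt() would be called with
the chosen attention type. This is just a sketch, not the attached code; the
port number, the three-float packet layout, and the applyLookAt() stub are all
placeholders.

// Rough standalone sketch of the reader loop (POSIX sockets, not the
// LLPumpIO-based code in the attachment). The packet layout -- three
// floats for the look-at point -- is a placeholder.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstring>
#include <cstdio>

static const int kTrackerPort = 5500;   // placeholder port

// Stand-in for the viewer-side reaction; in the real code this is where
// mLookAt->setLookAt(<attention type>, NULL, LLVector3(x, y, z)) would go.
static void applyLookAt(float x, float y, float z)
{
    printf("lookAt point: %f %f %f\n", x, y, z);
}

int main()
{
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    if (fd < 0) { perror("socket"); return 1; }

    sockaddr_in addr;
    std::memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(kTrackerPort);
    if (bind(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0)
    {
        perror("bind");
        return 1;
    }

    char buf[64];
    for (;;)
    {
        // Each datagram carries one look-at point; ignore short packets.
        ssize_t n = recvfrom(fd, buf, sizeof(buf), 0, NULL, NULL);
        if (n >= (ssize_t)(3 * sizeof(float)))
        {
            float p[3];
            std::memcpy(p, buf, sizeof(p));   // assumes sender byte order
            applyLookAt(p[0], p[1], p[2]);
        }
    }
    close(fd);
    return 0;
}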
I created a jira issue for the gesture tracker code:
https://jira.secondlife.com/browse/VWR-13713

I added the C# program that sends bytes to the UDP port, so the
reader/gesture-trigger/lookat code and the writer examples are both there now.
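The attached writer is C#, but the equivalent send amounts to only a few lines
in C++ too (same placeholder port and three-float packet as the reader sketch
above, not the actual attachment):

// Minimal sender sketch: pack a sample look-at point into a datagram
// and fire it at the (placeholder) tracker port on localhost.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstring>

int main()
{
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    if (fd < 0) return 1;

    sockaddr_in addr;
    std::memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(5500);                  // placeholder port
    inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);

    float point[3] = { 1.0f, 0.0f, 0.5f };        // sample look-at point
    sendto(fd, point, sizeof(point), 0,
           reinterpret_cast<sockaddr*>(&addr), sizeof(addr));

    close(fd);
    return 0;
}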
