Hi guys,

I finally have a sensor fusion algorithm which reads the values from an
accelerometer, a magnetometer and a gyroscope and correctly computes the
orientation of the sensors with respect to the world.
See
http://www.varesano.net/blog/fabio/initial-implementation-9-domdof-marg-imu-orientation-filter-adxl345-itg3200-and-hmc5843-a

I'm now able to get the values computed by the sensor array on the
Arduino into Soya, using threads and pyserial.
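
Roughly, the reading side looks like this (a minimal sketch; the serial
port, baud rate and "w x y z" line format are specific to my setup,
adjust them to match your Arduino sketch):

    import threading
    import serial

    # Latest quaternion from the sensor fusion filter (w, x, y, z);
    # identity until the first reading arrives.
    latest_quat = [1.0, 0.0, 0.0, 0.0]
    quat_lock = threading.Lock()

    def serial_reader():
        ser = serial.Serial("/dev/ttyUSB0", 57600, timeout=1)
        while True:
            parts = ser.readline().split()
            if len(parts) != 4:
                continue  # skip empty or partial lines
            with quat_lock:
                latest_quat[:] = [float(p) for p in parts]

    t = threading.Thread(target=serial_reader)
    t.daemon = True  # don't keep the process alive on exit
    t.start()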

Now, the sensor fusion algorithm gives me a quaternion representing the
orientation of the sensor array with respect to the world (e.g. the
computed yaw angle is the angle between the sensor's X axis and the
Earth's north, and so on).

I'm stuck on how to convert this quaternion into something I can use in
Soya. I also have methods to correctly calculate yaw, pitch and roll
(with respect to the real world), but I haven't been successful using
them in Soya.
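
For concreteness, the yaw, pitch and roll I compute follow the standard
aerospace (Z-Y-X) convention; a minimal sketch of that conversion from
the unit quaternion:

    import math

    def quat_to_euler(w, x, y, z):
        """Yaw, pitch, roll in degrees from a unit quaternion (Z-Y-X)."""
        yaw = math.atan2(2.0 * (w * z + x * y),
                         1.0 - 2.0 * (y * y + z * z))
        # asin argument clamped to [-1, 1] to guard against numerical drift
        pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
        roll = math.atan2(2.0 * (w * x + y * z),
                          1.0 - 2.0 * (x * x + y * y))
        return math.degrees(yaw), math.degrees(pitch), math.degrees(roll)

If I understand Soya's API correctly, I could then feed these angles to
a body via rotate_x / rotate_y / rotate_z (rotate_lateral /
rotate_vertical / rotate_incline in older Soya versions), but since
those calls apply relative rotations I'd presumably have to reset the
body's orientation every frame, and that's where I get lost.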

The idea is to have a body in Soya which moves as the sensor moves
(just like in the Processing application linked above), so I guess I
probably have to align the sensor's world frame to the Soya scene; a
hint on this would be appreciated.
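
To make the question concrete: if, say, the filter's world frame is NED
(X north, Y east, Z down) and the Soya scene uses the usual OpenGL-style
frame (X right, Y up, Z towards the viewer), is it enough to permute the
quaternion's vector part with the same axis mapping? Something like
(both frame conventions here are just an assumption on my part):

    def sensor_to_scene(w, x, y, z):
        # Assumed mapping: scene X = sensor Y (east),
        #                  scene Y = -sensor Z (up),
        #                  scene Z = -sensor X (south).
        # This permutation has determinant +1 (a proper rotation), so
        # applying it to the quaternion's vector part re-expresses the
        # same orientation in the scene frame.
        return (w, y, -z, -x)

Or is there a more idiomatic way to do this frame alignment in Soya?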


Thank you,

Fabio Varesano

_______________________________________________
Soya-user mailing list
Soya-user@gna.org
https://mail.gna.org/listinfo/soya-user
