Mel Av wrote:
Thanks for your answers.
I understand the theory behind this, which means I will need to set projection matrices for the left and right eye cameras according to my distance from the display surface. The problem is: can these cameras be retrieved and their projection matrices modified?
ted morris wrote:
yup, you will need to re-compute the proper frustum dynamically. To do
that, you will need to know what the position/orientation of the sensor
coordinate system is w.r.t. the 'window'.
Ah, I missed the part about the stereo projector. Yes, you will need to
adjust the projection
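The dynamic frustum recomputation described above can be sketched in plain C++. This assumes a simplified setup that the thread does not spell out: the screen rectangle lies in the z = 0 plane of the tracker frame, spanning [xl, xr] x [yb, yt], and the eye sits at (ex, ey, ez) in front of it. The struct and function names are mine, not OSG's; the resulting l/r/b/t/n/f values are what you would hand to something like `Camera::setProjectionMatrixAsFrustum()`:

```cpp
#include <cassert>
#include <cmath>

// Off-axis frustum parameters for a head-tracked window.
struct Frustum { double l, r, b, t, n, f; };

// Screen rectangle [xl, xr] x [yb, yt] in the z = 0 plane of the
// tracker frame; eye at (ex, ey, ez) with ez > 0.
Frustum headTrackedFrustum(double xl, double xr, double yb, double yt,
                           double ex, double ey, double ez,
                           double zNear, double zFar)
{
    // Project the screen edges onto the near plane: similar triangles
    // with ratio zNear / ez (eye-to-screen distance).
    const double s = zNear / ez;
    Frustum fr;
    fr.l = (xl - ex) * s;
    fr.r = (xr - ex) * s;
    fr.b = (yb - ey) * s;
    fr.t = (yt - ey) * s;
    fr.n = zNear;
    fr.f = zFar;
    return fr;
}
```

For stereo, compute this twice per frame with the eye position shifted by plus/minus half the interocular distance; each eye then gets its own asymmetric frustum.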
On Wed, Mar 3, 2010 at 10:56 AM, Mel Av melinos...@hotmail.com wrote:
Hey all,
Many thanks for your answers.
The Vicon system uses a z-up coordinate system, with millimeter units. It
sends x, y, z coordinates for a given tracked object (in that case the cap
you are wearing) as well as rotation
Hey,
I was wondering if anyone knows what would be the best solution for building an
application in OpenSceneGraph where the camera is head-tracked using motion-capture
data provided in real time by a Vicon system. I can get the position and
rotation of the 'real' camera (cap with infrared
I did something similar a very long time ago with OSG.
I set the ModelView and Projection matrices of the SceneView directly:
setProjectionMatrix(_mtx_proj)
setViewMatrix(vmtx)
For the model-view you would calculate Matrix::makeLookAt, and for the
projection matrix you need to compute your frustum.
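What the makeLookAt step computes can be illustrated with a hand-rolled look-at matrix in plain C++. This is equivalent in spirit to OSG's `Matrix::makeLookAt`, but the helper types and functions here are my own, not OSG API:

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Vec3 = std::array<double, 3>;
using Mat4 = std::array<double, 16>; // row-major

static Vec3 sub(Vec3 a, Vec3 b) { return {a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]};
}
static Vec3 norm(Vec3 v) {
    double len = std::sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
    return {v[0]/len, v[1]/len, v[2]/len};
}
static double dot(Vec3 a, Vec3 b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }

// View matrix in the gluLookAt style: rows are the camera basis,
// translation brings the eye to the origin, camera looks down -z.
Mat4 makeLookAt(Vec3 eye, Vec3 center, Vec3 up)
{
    Vec3 f = norm(sub(center, eye));  // forward
    Vec3 s = norm(cross(f, up));      // side (right)
    Vec3 u = cross(s, f);             // recomputed up
    return { s[0],  s[1],  s[2], -dot(s, eye),
             u[0],  u[1],  u[2], -dot(u, eye),
            -f[0], -f[1], -f[2],  dot(f, eye),
             0,     0,     0,     1 };
}

// Apply a row-major 4x4 to a point (w = 1).
Vec3 xform(const Mat4& m, Vec3 p)
{
    return { m[0]*p[0] + m[1]*p[1] + m[2]*p[2]  + m[3],
             m[4]*p[0] + m[5]*p[1] + m[6]*p[2]  + m[7],
             m[8]*p[0] + m[9]*p[1] + m[10]*p[2] + m[11] };
}
```

For head tracking, eye/center/up come from the tracked pose every frame rather than from a fixed camera manipulator.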
ted morris wrote:
OSGers-- I'm talking *very old versions* of OSG-- is there now a bundled
'convenience' class that takes care of this monkey business?
I'd think all you should need to do is call setViewMatrix() on the
Camera node. The sticky part is usually that the tracking system's
Hey all,
Many thanks for your answers.
The Vicon system uses a z-up coordinate system, with millimeter units. It sends
x, y, z coordinates for a given tracked object (in that case the cap you are
wearing) as well as rotation information in axis/angle form. The client I'm
using converts this to
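The unit and rotation conversions implied here can be sketched as follows, assuming the millimeter and axis/angle conventions described above (the helper names are mine; a real Vicon client would deliver these values through its own API):

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Vec3 = std::array<double, 3>;
using Mat3 = std::array<double, 9>; // row-major

// Vicon reports positions in millimetres; scenes are commonly modelled
// in metres, so scale the translation first.
Vec3 mmToMetres(Vec3 p) { return { p[0]/1000.0, p[1]/1000.0, p[2]/1000.0 }; }

// Rodrigues' formula: rotation matrix from an axis and an angle,
// matching the axis/angle form described in the post.
Mat3 axisAngleToMatrix(Vec3 axis, double angle)
{
    double len = std::sqrt(axis[0]*axis[0] + axis[1]*axis[1] + axis[2]*axis[2]);
    double x = axis[0]/len, y = axis[1]/len, z = axis[2]/len;
    double c = std::cos(angle), s = std::sin(angle), t = 1.0 - c;
    return { t*x*x + c,   t*x*y - s*z, t*x*z + s*y,
             t*x*y + s*z, t*y*y + c,   t*y*z - s*x,
             t*x*z - s*y, t*y*z + s*x, t*z*z + c };
}
```

Note that OSG's default world convention is also z-up, so the axis swap that plagues many tracker integrations may reduce to just the unit scaling here; verify against your own rig.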
Hi, Mel,
Mel Av wrote:
1. You have to invert the tr*rot result first
This probably comes from the fact that you're setting a View matrix,
which is essentially the inverse of a model matrix. The transform in
the OSG scene forms the Model matrix, which is combined with the view
matrix
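For a rigid transform the inversion is cheap: for a pose T*R, the inverse is R^T * T^-1, i.e. transpose the rotation block and apply it to the negated translation. A self-contained sketch (the 4x4 helpers here are mine, not OSG's):

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat4 = std::array<double, 16>; // row-major

Mat4 mul(const Mat4& a, const Mat4& b) {
    Mat4 c{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                c[i*4+j] += a[i*4+k] * b[k*4+j];
    return c;
}

// Rigid pose T*R: rotation about z by 'angle', then translation by t.
Mat4 makePose(double angle, double tx, double ty, double tz) {
    double c = std::cos(angle), s = std::sin(angle);
    return { c, -s, 0, tx,
             s,  c, 0, ty,
             0,  0, 1, tz,
             0,  0, 0, 1 };
}

// View matrix = inverse of the tracked pose: transpose the 3x3
// rotation block, then translation = -R^T * t.
Mat4 invertPose(const Mat4& m) {
    Mat4 inv{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            inv[i*4+j] = m[j*4+i];
    for (int i = 0; i < 3; ++i)
        inv[i*4+3] = -(inv[i*4+0]*m[3] + inv[i*4+1]*m[7] + inv[i*4+2]*m[11]);
    inv[15] = 1.0;
    return inv;
}
```

Multiplying the inverted pose back onto the pose yields the identity, which is a quick sanity check that the view matrix really undoes the tracked head transform.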