Hey guys...

So I'm still dealing with an issue with my matchmove setup, more in terms
of the matchmove itself than anything else.

But I'm trying to think of shortcuts around the problem.

It's a head that I'm matchmoving, and I would prefer to stay out of
Maya.  And due to the angle of it, I've had to throw away all the
matchmoves I was getting from 3D.  So I'm rebuilding this in Nuke now.

I have geo and a camera and all of that.

What I'm trying to figure out is whether there's a way to leverage each
space (2D and 3D) to help knock out this task more effectively.

I've used a PointsTo3D to set a better anchor point for the head object
in 3D space, re-set my .obj there, and set its pivot to the center of
that point.
Then I was adding a secondary TransformGeo node to hand-matchmove the
head into position.  But what I was thinking is that I could track a
feature on that object in 2D space to help with at least 30% of the
matchmove, taking out all the little jitter and things that will be hard
for me to nail.
Can I use that 2D point track to affect the 3D TransformGeo node?
Obviously the values are in two completely different worlds, but would
there be a way to combine the 2D transform data with the camera data to
generate a value that's applicable to the 3D TransformGeo node?  I'm
working on an undistorted plate, so 90% of the lens issues should be
removed from this equation.
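For reference, the math I'm imagining is a straight unprojection: push the
2D track back out along the camera ray to some guessed depth.  This is
just a rough sketch in plain Python (numbers and the fixed `depth` are
made up; it assumes a Nuke-style pinhole camera looking down -Z, with
screen coords normalized to [-1, 1]):

```python
import numpy as np

def unproject(track_uv, cam_pos, cam_rot, focal, haperture, vaperture, depth):
    """Push a 2D track point back into 3D along the camera ray.

    track_uv: normalized screen coords in [-1, 1], (0, 0) = image center
    cam_pos:  camera world position, shape (3,)
    cam_rot:  3x3 camera-to-world rotation matrix
    focal, haperture, vaperture: camera knobs in mm (Nuke-style)
    depth:    guessed camera-space distance to the feature (assumption)
    """
    u, v = track_uv
    # Camera-space point on the film back, scaled out to the chosen depth.
    # Nuke cameras look down -Z in their local space.
    p_cam = np.array([u * haperture / (2.0 * focal) * depth,
                      v * vaperture / (2.0 * focal) * depth,
                      -depth])
    return cam_pos + cam_rot @ p_cam

# Feature tracked dead-center, camera at origin, no rotation, 10 units away:
p = unproject((0.0, 0.0), np.zeros(3), np.eye(3), 50.0, 24.576, 18.672, 10.0)
# p lands at (0, 0, -10), i.e. straight down the camera axis
```

The obvious catch is that the depth has to come from somewhere, which is
exactly what the PointsTo3D anchor would give me for that feature.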

Thoughts?  I was thinking of having a Reconcile3D run on every 2D point
and generating a Transform node based on that.  But would that work?
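For the jitter part specifically, the back-of-envelope conversion I have
in mind is pixel delta to world-space offset at the feature's depth.  A
minimal sketch, assuming a simple pinhole model and made-up example
numbers (2K plate, 50mm lens, feature roughly 10 units from camera):

```python
def pixels_to_world(delta_px, depth, focal, haperture, image_width_px):
    """Convert a 2D track jitter (in pixels) into a world-space offset
    at a given camera-space depth, for a pinhole camera.

    The film back (haperture, mm) maps onto image_width_px pixels, so the
    world-space width visible at `depth` is haperture * depth / focal,
    and one pixel covers that width divided by the plate width.
    """
    world_per_pixel = haperture * depth / (focal * image_width_px)
    return delta_px * world_per_pixel

# 2 px of jitter, feature ~10 units from a 50mm camera, 24.576mm back, 2K plate:
dx = pixels_to_world(2.0, 10.0, 50.0, 24.576, 2048)
# dx comes out around 0.0048 world units
```

So tiny pixel jitter maps to tiny translate corrections on the
TransformGeo, which is why I think the 2D track could carry that part of
the work.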

Thanks guys!

Justin

-- 
Justin Ball VFX
VFX Supervisor, Comp, Effects, Pipeline and more...
jus...@justinballvfx.com
818.384.0923
_______________________________________________
Nuke-users mailing list
Nuke-users@support.thefoundry.co.uk, http://forums.thefoundry.co.uk/
http://support.thefoundry.co.uk/cgi-bin/mailman/listinfo/nuke-users