A little late to the party, but just wanted to add my 2 cents in case
it helps someone make a full dollar :)

@Thomas: the process Michael described above is exactly what you need
if your starting point is already a world position pass. You only need
that pass rendered for one eye, plus the other eye's camera, to get
disparity data.

Unfortunately, this is a lot more tedious to do with standard nodes
than it would be by writing a dedicated plugin, especially if you want
to account for every possible variation in the cameras.
For example, you can get the transformation matrix of your cameras
from their world_matrix knob, but you can't get the projection matrix
(unless you're writing a plugin, that is). So you need to build the
camera-to-screen transformation manually from the knobs on the
camera's Projection tab. For simple cases, the focal and horizontal
aperture values are enough, but if you need to account for
window_translate, window_scale and roll changes, it gets messy very
quickly.
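
In case it helps to see the math outside of a node tree, here's
roughly what the simple case boils down to in plain Python (just the
arithmetic, no Nuke API calls). Treat it as a sketch under the same
simplifications: the camera transform has no scale or shear,
win_translate/win_scale are left at their defaults, and the camera's
world_matrix has already been read out as a row-major 4x4 list. The
names are only illustrative.

def world_to_ndc(world_pt, cam_world_matrix, focal, haperture):
    # cam_world_matrix is camera-to-world (Nuke's world_matrix), row-major 4x4
    # world -> camera: invert the rigid transform (assumes no scale/shear)
    r = [row[:3] for row in cam_world_matrix[:3]]      # 3x3 rotation part
    t = [cam_world_matrix[i][3] for i in range(3)]     # translation part
    d = [world_pt[i] - t[i] for i in range(3)]
    cam_pt = [sum(r[j][i] * d[j] for j in range(3)) for i in range(3)]  # R^T * (p - t)
    # camera -> screen: plain pinhole projection, Nuke cameras look down -Z
    if cam_pt[2] >= 0.0:
        return None                                    # point is behind the camera
    s = 2.0 * focal / haperture
    return (s * cam_pt[0] / -cam_pt[2], s * cam_pt[1] / -cam_pt[2])

That gives you normalized screen coordinates in the -1..1 range, where
+/-1 horizontally corresponds to the edges of the horizontal aperture.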

That said, I've put together a little Nuke script (attached) to go
from world position to disparity. It could be more compact, but it's
split out to show the different transforms and conversions between
coordinate systems one by one, so hopefully it'll be easier to follow.
Keep in mind, though, that as mentioned above it doesn't account for
changes to the win_translate or win_scale knobs.
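
If you want to sanity-check the script's output, the disparity itself
is just the difference between where the same world point lands
through each eye's camera, converted to pixels. Continuing from the
sketch above (again, the names and the NDC-to-pixel conversion are my
own assumptions, not taken from the script):

def world_to_disparity(world_pt, left_cam, right_cam, image_width):
    # left_cam / right_cam are (world_matrix, focal, haperture) tuples per eye
    left = world_to_ndc(world_pt, *left_cam)
    right = world_to_ndc(world_pt, *right_cam)
    if left is None or right is None:
        return None
    # disparity = right-eye screen position minus left-eye screen position,
    # scaled from NDC (-1..1) into pixels; both axes use the width because
    # the NDC above is width-relative
    dx = (right[0] - left[0]) * image_width / 2.0
    dy = (right[1] - left[1]) * image_width / 2.0
    return dx, dy

Run that per pixel, with world_pt taken from the left eye's Pworld
pass, and you get the left-to-right disparity; swap the cameras for
the other direction.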

Hope that helps.

Cheers,
Ivan



On Sun, Apr 1, 2012 at 12:52 PM, Michael Habenicht <m...@tinitron.de> wrote:
> Hi Thomas,
>
> you are right, the Pworld pass is already the first part. We have the screen
> space and the corresponding world position. But to be able to calculate the
> disparity you need the screen-space position of that particular point as
> viewed through the second camera. It is certainly possible to calculate this
> from the camera, as that is what the ScanlineRender and Reconcile3D nodes
> do. But don't ask me about the math.
>
> Best regards,
> Michael
>
> Am 01.04.2012 18:08, schrieb thoma:
>>
>> Hi Michael,
>>
>> We're using Arnold. If I have my stereo Pworld passes and stereo cameras
>> in Nuke, couldn't I make this work? When you say world position projected
>> to screen space, isn't that essentially what a Ppass is, or are you
>> talking about something more? I tried doing some logic operations on the
>> left and right eyes of the Ppass to see if it would yield anything
>> meaningful, but no luck...
>>
>> Thanks
>> Thomas
>>

Attachment: world_to_disparity.nk

_______________________________________________
Nuke-users mailing list
Nuke-users@support.thefoundry.co.uk, http://forums.thefoundry.co.uk/
http://support.thefoundry.co.uk/cgi-bin/mailman/listinfo/nuke-users
