No worries!
Glad it worked out for you.


On Mon, Sep 2, 2013 at 12:14 AM, Jed Smith <jedy...@gmail.com> wrote:

>  Ivan, thank you very much for your helpful wisdom! :)
>
> I think I finally understand the intricacies of how UV maps work. With your
> awesome technique for inverting them, my gizmo seems to be working as
> expected, and with much better accuracy.
>
> It is updated here if anyone wants to take a look:
> https://gist.github.com/jedypod/6302723
>
> On Sunday, 2013-09-01 at 5:47p, Ivan Busquets wrote:
>
> Hi Jed,
>
> I believe the problem in your approach lies in your assumption about how
> STMap works.
> STMap does a lookup for each pixel in the output image to find the
> coordinates in the input it needs to sample from to produce the final pixel
> value. In other words, it does not "push" pixels from the input into
> discrete output locations, but "pulls" pixels from the input for each pixel
> in the output. It's the difference between forward and backward warping.
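>
> To illustrate the "pull" lookup, here is a rough Python sketch of the idea
> (my own illustration of backward warping, not The Foundry's implementation;
> it assumes src and uv are plain 2D lists indexed [y][x], with uv[y][x]
> holding a (u, v) pair in the 0-1 range, and it ignores filtering):
>
>     def pull_warp(src, uv, width, height):
>         out = [[None] * width for _ in range(height)]
>         for y in range(height):
>             for x in range(width):
>                 u, v = uv[y][x]                      # UV stored at the OUTPUT pixel
>                 sx = min(int(u * width), width - 1)  # input x to sample from
>                 sy = min(int(v * height), height - 1)
>                 out[y][x] = src[sy][sx]              # pull the input pixel into the output
>         return out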
>
> To do what you're trying to do you would effectively need a UV map that's
> the inverse of that distortion. You can get an approximation of such an
> inverse UV map by displacing the vertices of a card, which would be a way
> of forward-warping. The only caveat is that you'll need a card that has as
> many subdivisions/vertices as possible, since the distortion values will be
> interpolated between vertices. That's why it's only an approximation at
> best. But given enough subdivisions, it should get you close enough.
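>
> Conceptually, building that inverse by forward-warping looks something like
> the sketch below (again just an illustration of the idea, not the attached
> script; same 2D-list convention as before). Every pixel writes its own
> normalized position into the inverse map at the location the original map
> points to, and the gaps left between the scattered samples are what the
> card's vertex interpolation fills in for you:
>
>     def approximate_inverse_uv(uv, width, height):
>         inv = [[None] * width for _ in range(height)]
>         for y in range(height):
>             for x in range(width):
>                 u, v = uv[y][x]
>                 tx = min(int(u * width), width - 1)   # where this pixel lands
>                 ty = min(int(v * height), height - 1)
>                 inv[ty][tx] = ((x + 0.5) / width, (y + 0.5) / height)
>         return inv  # None entries are holes that still need interpolation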
>
> Once you have that inverse UV map, your distorted XY coordinate should
> just be the UV value at your undistorted coordinate, multiplied by width
> and height. (script attached as an example).
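>
> In Python inside Nuke, that last step could look roughly like this (a sketch
> only; 'InverseUVMap' is a placeholder for whatever node holds your inverse
> map, with U in the red channel and V in green):
>
>     import nuke
>
>     inv = nuke.toNode('InverseUVMap')
>     w, h = inv.width(), inv.height()
>     ux, uy = 1792.0, 476.0                      # undistorted track position
>     dist_x = inv.sample('red', ux, uy) * w      # distorted x
>     dist_y = inv.sample('green', ux, uy) * h    # distorted y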
>
> P.S. The other "minor" thing you might want to look into is the way you're
> generating your UV map. The expressions you're using, "x/width" and
> "y/height", will result in a UV map that displaces the image by half a pixel
> right out of the gate when fed into an STMap. STMap samples pixels from the
> input at their centre (x+0.5, y+0.5), so for a more accurate UV map you
> should use U = (x+0.5)/width and V = (y+0.5)/height.
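>
> A quick numerical check of that half-pixel shift (just illustrative, using
> the 1920-wide frame from your example):
>
>     width, x = 1920.0, 1792.0
>     naive = (x / width) * width            # 1792.0 = the pixel's left edge
>     centred = ((x + 0.5) / width) * width  # 1792.5 = the pixel's centre
>     # STMap samples the input at pixel centres, so the naive map ends up
>     # looking half a pixel away from where pixel 1792 actually sits.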
>
> Hope that helps.
>
> Cheers,
> Ivan
>
> On Sat, Aug 31, 2013 at 8:18 PM, Jed Smith <jedy...@gmail.com> wrote:
>
>  Greetings!
>
> *The Problem*
> I am trying to write a tool to distort tracking data through a distortion
> map output by a LensDistortion node. I have everything working, except that
> my method of calculating the distorted pixel position from the sampled
> values of the UV distortion map appears inaccurate when compared against a
> visual check.
>
> *My Method*
> Say there is a pixel at 1792,476 in a 1080p frame. I have a standard
> UV map, modified with a Grade node through a mask, creating a localized
> distortion when this map is plugged into an STMap node. The distorted pixel
> position is 1821,484.
>
> The sampled UV map pixel values at the source pixel location are 0.916767,
> 0.432918 (for the width and height offsets, respectively).
>
> I am going on the assumption that the UV map pixel values represent the
> distorted location of that pixel, expressed as a floating-point fraction of
> the frame width and height. So a value of 0.916767, 0.432918 would
> effectively tell the STMap node to offset this pixel in the output by the
> difference between the 'unity' UV map value (the one that would result in no
> transformation) and the sampled UV value, multiplied by the frame width or
> height.
>
> For horizontal distortion offset, this would be:
> (pixel_coordinate_x / frame_width - uvmap_red) * frame_width, or (1792 /
> 1920 - 0.916767) * 1920 = 31.807
> This would result in a distorted horizontal value of 1792+31.807 =
> 1823.807. This value is close, but almost 3 pixels off.
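>
> In code form, the calculation above is roughly:
>
>     src_x, frame_width = 1792.0, 1920.0
>     uvmap_red = 0.916767                                       # sampled at the source pixel
>     offset = (src_x / frame_width - uvmap_red) * frame_width   # ~31.807
>     distorted_x = src_x + offset                               # ~1823.807, but the visual check says ~1821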
>
> *Help!*
> Can anyone here provide some insight into how exactly the math in the
> STMap node works to determine the output location of a pixel from the
> incoming UV map values? I have attached a small Nuke script demonstrating
> what I am talking about. See the "Test_STMAP_Distortion_Calculations" node
> for the output of the above algorithm.
>
> And if anyone is curious to check out the "DistortTracks" gizmo as it
> exists so far, it lives here <https://gist.github.com/jedypod/6302723>.
>
> Thanks very much!
>
>
> Attachments:
>  - inverse_UV.nk
_______________________________________________
Nuke-users mailing list
Nuke-users@support.thefoundry.co.uk, http://forums.thefoundry.co.uk/
http://support.thefoundry.co.uk/cgi-bin/mailman/listinfo/nuke-users
