Re: [Nuke-users] Re: difference between uv project and project 3d

2012-03-24 Thread Randy Little
so it's Normalized Cartesian coordinates. Makes sense, since 0,0 should
be the scene origin, right? (it is in the real optics world)

Randy S. Little
http://www.rslittle.com




On Sat, Mar 24, 2012 at 17:12, Ivan Busquets  wrote:
> Normalized Device Coordinates
>
> For prman, that's the viewing device (camera) frustum, normalized to a -1 to
> 1 box.
>
> As Michael said, the NDC matrix is what you would use to project a point
> into camera space, and then you'd use a camera-to-world matrix to get a
> coordinate in world space.
>
> On Mar 24, 2012 5:02 PM, "Randy Little"  wrote:
>>
>> what does ndc stand for (normal depth coord?)
>> Randy S. Little
>> http://www.rslittle.com
>>
>>
>>
>>
>> On Sat, Mar 24, 2012 at 15:01, Michael Garrett 
>> wrote:
>> > Right, from memory it's doing this:
>> > - construct ndc coordinates for x and y
>> > - use depth for position.z in camera space
>> > - use the camera projection matrix to convert ndc x and y to camera
>> > space
>> > position x and y
>> > - use camera transformation matrix to translate and rotate to position
>> > in
>> > world space.
>> >
>> > Michael
>> >
>> >
>> >
>> > On Mar 24, 2012, at 5:11 AM, ari Rubenstein  wrote:
>> >
>> > Ivan,
>> > If one doesn't alter camera clip planes and one understands inversion of
>> > varying depth between maya, nuke and such... otherwise it's pretty
>> > straightforward like Nathan's ?
>> >
>> > Ari
>> >
>> > Sent from my iPhone
>> >
>> > On Mar 23, 2012, at 8:44 PM, Michael Garrett 
>> > wrote:
>> >
>> > Totally agree it's made all the difference since the Shake days.  Thanks
>> > Ivan for contributing this (and all the other stuff!).
>> >
>> > Ari, I do have a gizmo version of a depth to Pworld conversion but it
>> > assumes raw planar depth from camera.  Though once you start factoring
>> > in
>> > near and far clipping planes, and different depth formats, it gets a bit
>> > more complicated.  Ivan may have something to say on this.
>> >
>> > Michael
>> >
>> >
>> >
>> > On 23 March 2012 03:16, ari Rubenstein  wrote:
>> >>
>> >> Wow, much appreciated.
>> >>
>> >>
>> >> Thinking back to how artists and studios in the film industry used to
>> >> hold
>> >> tight their techniques for leverage and advantage, it's great to see
>> >> how
>> >> much "this" comp community encourages and props up one another for
>> >> creative
>> >> advancement for all.
>> >>
>> >> Thanks again Ivan
>> >>
>> >> Ari
>> >>
>> >>
>> >> Sent from my iPhone
>> >>
>> >> On Mar 23, 2012, at 3:16 AM, Ivan Busquets 
>> >> wrote:
>> >>
>> >> Hey Ari,
>> >>
>> >> Here's the plugin I mentioned before.
>> >>
>> >> http://www.nukepedia.com/gizmos/plugins/3d/stickyproject/
>> >>
>> >> There's only compiled versions for Nuke 6.3 (MacOS and Linux64), but
>> >> I've
>> >> uploaded the source code as well, so someone else can compile it for
>> >> Windows
>> >> if needed
>> >>
>> >> Hope it proves useful.
>> >> Cheers,
>> >> Ivan
>> >>
>> >> On Wed, Mar 21, 2012 at 2:55 PM,  wrote:
>> >>>
>> >>> thanks Frank for the clarification.
>> >>>
>> >>> thanks Ivan for digging that plugin up if ya can.  i have a solution I
>> >>> wrapped into a tool as well, but I'd love to see your approach as
>> >>> well.
>> >>>
>> >>>
>> >>>
>> >>> >> 2)  if you've imported an obj sequence with UV's already on (for an
>> >>> >> animated, deformable piece of geo)... and your using a static
>> >>> >> camera
>> >>> >> (say
>> >>> >> a single frame of your shot camera)... is there a way to do
>> >>> >> something
>> >>> >> akin
>> >>> >> to Maya's "texture reference object" whereby the UV's are changed
>> >>> >> based
>> >>> >> on
>> >>> >> this static camera, for all the subsequent frames of the obj
>> >>> >> sequence
>> >>> >> ?
>> >>> >
>> >>> >
>> >>> > I've got a plugin that does exactly that. I'll see if I can share on
>> >>> > Nukepedia soon.
>> >>> >
>> >>> > Cheers,
>> >>> > Ivan
>> >>> >
>> >>> > On Wed, Mar 21, 2012 at 2:31 PM, Frank Rueter
>> >>> > 
>> >>> > wrote:
>> >>> >
>> >>> >> 1 - UVProject creates UVs from scratch, with 0,0 in the lower left
>> >>> >> of
>> >>> >> the
>> >>> >> camera frustum and 1,1 in the upper right.
>> >>> >>
>> >>> >> 2 - been waiting for that feature a long time ;). It should be
>> >>> >> logged
>> >>> >> as
>> >>> >> a
>> >>> >> feature request but would certainly be good to report again to make
>> >>> >> sure
>> >>> >> (and to push it in priority)
>> >>> >>
>> >>> >>
>> >>> >>
>> >>> >>
>> >>> >> On 3/22/12 8:28 AM, a...@curvstudios.com wrote:
>> >>> >>
>> >>> >>> couple more questions:
>> >>> >>>
>> >>> >>> 1)  if imported geo does not already have UV's, will UVproject
>> >>> >>> create
>> >>> >>> a
>> >>> >>> new set or does it require them to...replace them ?
>> >>> >>>
>> >>> >>> 2)  if you've imported an obj sequence with UV's already on (for
>> >>> >>> an
>> >>> >>> animated, deformable piece of geo)... and your using a static
>> >>> >>> camera
>> >>> >>> (say
>> >>> >>> a single frame of your shot camera)... is there a way to do
>> >>> 

Re: [Nuke-users] Re: difference between uv project and project 3d

2012-03-24 Thread Ivan Busquets
Normalized Device Coordinates

For prman, that's the viewing device (camera) frustum, normalized to a -1
to 1 box.

As Michael said, the NDC matrix is what you would use to project a point
into camera space, and then you'd use a camera-to-world matrix to get a
coordinate in world space.
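
For a concrete picture of that -1 to 1 box, here's a small numpy sketch using an
assumed symmetric perspective matrix (illustrative frustum values only, not
prman's or Nuke's actual matrices):

import numpy as np

# assumed frustum: near/far clip and half-width/half-height of the near plane
near, far, right, top = 0.1, 10000.0, 0.5, 0.28125

# symmetric perspective projection matrix (camera space -> clip space)
P = np.array([
    [near / right, 0.0,        0.0,                          0.0],
    [0.0,          near / top, 0.0,                          0.0],
    [0.0,          0.0,        -(far + near) / (far - near), -2.0 * far * near / (far - near)],
    [0.0,          0.0,        -1.0,                         0.0],
])

p_cam = np.array([0.25, 0.1, -5.0, 1.0])   # a point in camera space
clip = P @ p_cam
p_ndc = clip[:3] / clip[3]                 # lands inside the -1..1 box if the point is in the frustum

# Going the other way (NDC -> camera -> world), as described above, uses the
# inverses: recover p_cam via inv(P), then p_world = cam_to_world @ p_cam
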
 On Mar 24, 2012 5:02 PM, "Randy Little"  wrote:

> what does ndc stand for (normal depth coord?)
> Randy S. Little
> http://www.rslittle.com
>
>
>
>
> On Sat, Mar 24, 2012 at 15:01, Michael Garrett 
> wrote:
> > Right, from memory it's doing this:
> > - construct ndc coordinates for x and y
> > - use depth for position.z in camera space
> > - use the camera projection matrix to convert ndc x and y to camera space
> > position x and y
> > - use camera transformation matrix to translate and rotate to position in
> > world space.
> >
> > Michael
> >
> >
> >
> > On Mar 24, 2012, at 5:11 AM, ari Rubenstein  wrote:
> >
> > Ivan,
> > If one doesn't alter camera clip planes and one understands inversion of
> > varying depth between maya, nuke and such... otherwise it's pretty
> > straightforward like Nathan's ?
> >
> > Ari
> >
> > Sent from my iPhone
> >
> > On Mar 23, 2012, at 8:44 PM, Michael Garrett 
> wrote:
> >
> > Totally agree it's made all the difference since the Shake days.  Thanks
> > Ivan for contributing this (and all the other stuff!).
> >
> > Ari, I do have a gizmo version of a depth to Pworld conversion but it
> > assumes raw planar depth from camera.  Though once you start factoring in
> > near and far clipping planes, and different depth formats, it gets a bit
> > more complicated.  Ivan may have something to say on this.
> >
> > Michael
> >
> >
> >
> > On 23 March 2012 03:16, ari Rubenstein  wrote:
> >>
> >> Wow, much appreciated.
> >>
> >>
> >> Thinking back to how artists and studios in the film industry used to
> hold
> >> tight their techniques for leverage and advantage, it's great to see how
> >> much "this" comp community encourages and props up one another for
> creative
> >> advancement for all.
> >>
> >> Thanks again Ivan
> >>
> >> Ari
> >>
> >>
> >> Sent from my iPhone
> >>
> >> On Mar 23, 2012, at 3:16 AM, Ivan Busquets 
> wrote:
> >>
> >> Hey Ari,
> >>
> >> Here's the plugin I mentioned before.
> >>
> >> http://www.nukepedia.com/gizmos/plugins/3d/stickyproject/
> >>
> >> There's only compiled versions for Nuke 6.3 (MacOS and Linux64), but
> I've
> >> uploaded the source code as well, so someone else can compile it for
> Windows
> >> if needed
> >>
> >> Hope it proves useful.
> >> Cheers,
> >> Ivan
> >>
> >> On Wed, Mar 21, 2012 at 2:55 PM,  wrote:
> >>>
> >>> thanks Frank for the clarification.
> >>>
> >>> thanks Ivan for digging that plugin up if ya can.  i have a solution I
> >>> wrapped into a tool as well, but I'd love to see your approach as well.
> >>>
> >>>
> >>>
> >>> >> 2)  if you've imported an obj sequence with UV's already on (for an
> >>> >> animated, deformable piece of geo)... and your using a static camera
> >>> >> (say
> >>> >> a single frame of your shot camera)... is there a way to do
> something
> >>> >> akin
> >>> >> to Maya's "texture reference object" whereby the UV's are changed
> >>> >> based
> >>> >> on
> >>> >> this static camera, for all the subsequent frames of the obj
> sequence
> >>> >> ?
> >>> >
> >>> >
> >>> > I've got a plugin that does exactly that. I'll see if I can share on
> >>> > Nukepedia soon.
> >>> >
> >>> > Cheers,
> >>> > Ivan
> >>> >
> >>> > On Wed, Mar 21, 2012 at 2:31 PM, Frank Rueter  >
> >>> > wrote:
> >>> >
> >>> >> 1 - UVProject creates UVs from scratch, with 0,0 in the lower left
> of
> >>> >> the
> >>> >> camera frustum and 1,1 in the upper right.
> >>> >>
> >>> >> 2 - been waiting for that feature a long time ;). It should be logged
> >>> >> as
> >>> >> a
> >>> >> feature request but would certainly be good to report again to make
> >>> >> sure
> >>> >> (and to push it in priority)
> >>> >>
> >>> >>
> >>> >>
> >>> >>
> >>> >> On 3/22/12 8:28 AM, a...@curvstudios.com wrote:
> >>> >>
> >>> >>> couple more questions:
> >>> >>>
> >>> >>> 1)  if imported geo does not already have UV's, will UVproject
> create
> >>> >>> a
> >>> >>> new set or does it require them to...replace them ?
> >>> >>>
> >>> >>> 2)  if you've imported an obj sequence with UV's already on (for an
> >>> >>> animated, deformable piece of geo)... and your using a static
> camera
> >>> >>> (say
> >>> >>> a single frame of your shot camera)... is there a way to do
> something
> >>> >>> akin
> >>> >>> to Maya's "texture reference object" whereby the UV's are changed
> >>> >>> based
> >>> >>> on
> >>> >>> this static camera, for all the subsequent frames of the obj
> sequence
> >>> >>> ?
> >>> >>>
> >>> >>> ..sorry if I'm too verbose...that was sort of a stream of
> >>> >>> consciousness
> >>> >>> question.  Basically I'm asking if there is an easier way than my
> >>> >>> current
> >>> >>> method where I export an obj sequence with UV's, project3D on a
> >>> >>> single
> >>> >>

Re: [Nuke-users] Re: difference between uv project and project 3d

2012-03-24 Thread Randy Little
what does ndc stand for (normal depth coord?)
Randy S. Little
http://www.rslittle.com




On Sat, Mar 24, 2012 at 15:01, Michael Garrett  wrote:
> Right, from memory it's doing this:
> - construct ndc coordinates for x and y
> - use depth for position.z in camera space
> - use the camera projection matrix to convert ndc x and y to camera space
> position x and y
> - use camera transformation matrix to translate and rotate to position in
> world space.
>
> Michael
>
>
>
> On Mar 24, 2012, at 5:11 AM, ari Rubenstein  wrote:
>
> Ivan,
> If one doesn't alter camera clip planes and one understands inversion of
> varying depth between maya, nuke and such... otherwise it's pretty
> straightforward like Nathan's ?
>
> Ari
>
> Sent from my iPhone
>
> On Mar 23, 2012, at 8:44 PM, Michael Garrett  wrote:
>
> Totally agree it's made all the difference since the Shake days.  Thanks
> Ivan for contributing this (and all the other stuff!).
>
> Ari, I do have a gizmo version of a depth to Pworld conversion but it
> assumes raw planar depth from camera.  Though once you start factoring in
> near and far clipping planes, and different depth formats, it gets a bit
> more complicated.  Ivan may have something to say on this.
>
> Michael
>
>
>
> On 23 March 2012 03:16, ari Rubenstein  wrote:
>>
>> Wow, much appreciated.
>>
>>
>> Thinking back to how artists and studios in the film industry used to hold
>> tight their techniques for leverage and advantage, it's great to see how
>> much "this" comp community encourages and props up one another for creative
>> advancement for all.
>>
>> Thanks again Ivan
>>
>> Ari
>>
>>
>> Sent from my iPhone
>>
>> On Mar 23, 2012, at 3:16 AM, Ivan Busquets  wrote:
>>
>> Hey Ari,
>>
>> Here's the plugin I mentioned before.
>>
>> http://www.nukepedia.com/gizmos/plugins/3d/stickyproject/
>>
>> There's only compiled versions for Nuke 6.3 (MacOS and Linux64), but I've
>> uploaded the source code as well, so someone else can compile it for Windows
>> if needed
>>
>> Hope it proves useful.
>> Cheers,
>> Ivan
>>
>> On Wed, Mar 21, 2012 at 2:55 PM,  wrote:
>>>
>>> thanks Frank for the clarification.
>>>
>>> thanks Ivan for digging that plugin up if ya can.  i have a solution I
>>> wrapped into a tool as well, but I'd love to see your approach as well.
>>>
>>>
>>>
>>> >> 2)  if you've imported an obj sequence with UV's already on (for an
>>> >> animated, deformable piece of geo)... and your using a static camera
>>> >> (say
>>> >> a single frame of your shot camera)... is there a way to do something
>>> >> akin
>>> >> to Maya's "texture reference object" whereby the UV's are changed
>>> >> based
>>> >> on
>>> >> this static camera, for all the subsequent frames of the obj sequence
>>> >> ?
>>> >
>>> >
>>> > I've got a plugin that does exactly that. I'll see if I can share on
>>> > Nukepedia soon.
>>> >
>>> > Cheers,
>>> > Ivan
>>> >
>>> > On Wed, Mar 21, 2012 at 2:31 PM, Frank Rueter 
>>> > wrote:
>>> >
>>> >> 1 - UVProject creates UVs from scratch, with 0,0 in the lower left of
>>> >> the
>>> >> camera frustum and 1,1 in the upper right.
>>> >>
>>> >> 2 - been waiting for that feature a long time ;). It should be logged
>>> >> as
>>> >> a
>>> >> feature request but would certainly be good to report again to make
>>> >> sure
>>> >> (and to push it in priority)
>>> >>
>>> >>
>>> >>
>>> >>
>>> >> On 3/22/12 8:28 AM, a...@curvstudios.com wrote:
>>> >>
>>> >>> couple more questions:
>>> >>>
>>> >>> 1)  if imported geo does not already have UV's, will UVproject create
>>> >>> a
>>> >>> new set or does it require them to...replace them ?
>>> >>>
>>> >>> 2)  if you've imported an obj sequence with UV's already on (for an
>>> >>> animated, deformable piece of geo)... and your using a static camera
>>> >>> (say
>>> >>> a single frame of your shot camera)... is there a way to do something
>>> >>> akin
>>> >>> to Maya's "texture reference object" whereby the UV's are changed
>>> >>> based
>>> >>> on
>>> >>> this static camera, for all the subsequent frames of the obj sequence
>>> >>> ?
>>> >>>
>>> >>> ..sorry if I'm too verbose...that was sort of a stream of
>>> >>> consciousness
>>> >>> question.  Basically I'm asking if there is an easier way than my
>>> >>> current
>>> >>> method where I export an obj sequence with UV's, project3D on a
>>> >>> single
>>> >>> frame, render with scanline to unwrapped UV, then input that into the
>>> >>> full
>>> >>> obj sequence to get my "paint" to stick throughout.
>>> >>>
>>> >>> oy, sorry again.
>>> >>>
>>> >>> Ari
>>> >>> Blue Sky
>>> >>>
>>> >>>
>>> >>>
>>> >>>  ivanbusquets wrote:
>>> 
>>> > You can think of UVProject as a "baked" or "sticky" projection.
>>> >
>>> > The main difference is how they'll behave if you transform/deform
>>> > your
>>> > geometry AFTER your projection.
>>> >
>>> > UVProject "bakes" the UV values into each vertex, so if you
>>> > transform
>>> > those vertices later on, they'll still pull the textures f

Re: [Nuke-users] Re: difference between uv project and project 3d

2012-03-24 Thread Michael Garrett
Right, from memory it's doing this:
- construct ndc coordinates for x and y
- use depth for position.z in camera space
- use the camera projection matrix to convert ndc x and y to camera space position x and y
- use camera transformation matrix to translate and rotate to position in world space.
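
A minimal numpy sketch of those steps, assuming planar depth (distance along the
camera's -Z axis) and treating focal/haperture/vaperture and the camera-to-world
matrix as plain inputs rather than values read off an actual Camera node:

import numpy as np

def pixel_depth_to_world(px, py, depth, width, height,
                         focal, haperture, vaperture, cam_to_world):
    # 1. construct ndc coordinates for x and y (-1..1, centred on the image)
    ndc_x = 2.0 * (px + 0.5) / width - 1.0
    ndc_y = 2.0 * (py + 0.5) / height - 1.0

    # 2. use depth for position.z in camera space (camera looks down -Z)
    z_cam = -depth

    # 3. the projection (written here in its tangent form rather than as a full
    #    matrix inverse) converts ndc x and y to camera-space x and y
    x_cam = ndc_x * (0.5 * haperture / focal) * depth
    y_cam = ndc_y * (0.5 * vaperture / focal) * depth

    # 4. the camera transformation matrix translates/rotates to world space
    p_world = cam_to_world @ np.array([x_cam, y_cam, z_cam, 1.0])
    return p_world[:3]

# e.g. pixel_depth_to_world(960, 540, 12.0, 1920, 1080,
#                           focal=50.0, haperture=36.0, vaperture=24.0,
#                           cam_to_world=np.eye(4))
# A normalized or inverted depth channel would first need remapping back to a
# planar distance, which is where the clip-plane/depth-format complications
# mentioned elsewhere in the thread come in.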

Michael



On Mar 24, 2012, at 5:11 AM, ari Rubenstein  wrote:

> Ivan,
> If one doesn't alter camera clip planes and one understands inversion of 
> varying depth between maya, nuke and such... otherwise it's pretty 
> straightforward like Nathan's ?
> 
> Ari
> 
> Sent from my iPhone
> 
> On Mar 23, 2012, at 8:44 PM, Michael Garrett  wrote:
> 
>> Totally agree it's made all the difference since the Shake days.  Thanks 
>> Ivan for contributing this (and all the other stuff!).
>> 
>> Ari, I do have a gizmo version of a depth to Pworld conversion but it 
>> assumes raw planar depth from camera.  Though once you start factoring in 
>> near and far clipping planes, and different depth formats, it gets a bit 
>> more complicated.  Ivan may have something to say on this.
>> 
>> Michael
>> 
>> 
>> 
>> On 23 March 2012 03:16, ari Rubenstein  wrote:
>> Wow, much appreciated.
>> 
>> 
>> Thinking back to how artists and studios in the film industry used to hold 
>> tight their techniques for leverage and advantage, it's great to see how 
>> much "this" comp community encourages and props up one another for creative 
>> advancement for all. 
>> 
>> Thanks again Ivan
>> 
>> Ari
>> 
>> 
>> Sent from my iPhone
>> 
>> On Mar 23, 2012, at 3:16 AM, Ivan Busquets  wrote:
>> 
>>> Hey Ari,
>>> 
>>> Here's the plugin I mentioned before.
>>> 
>>> http://www.nukepedia.com/gizmos/plugins/3d/stickyproject/
>>> 
>>> There's only compiled versions for Nuke 6.3 (MacOS and Linux64), but I've 
>>> uploaded the source code as well, so someone else can compile it for 
>>> Windows if needed
>>> 
>>> Hope it proves useful.
>>> Cheers,
>>> Ivan
>>> 
>>> On Wed, Mar 21, 2012 at 2:55 PM,  wrote:
>>> thanks Frank for the clarification.
>>> 
>>> thanks Ivan for digging that plugin up if ya can.  i have a solution I
>>> wrapped into a tool as well, but I'd love to see your approach as well.
>>> 
>>> 
>>> 
>>> >> 2)  if you've imported an obj sequence with UV's already on (for an
>>> >> animated, deformable piece of geo)... and your using a static camera
>>> >> (say
>>> >> a single frame of your shot camera)... is there a way to do something
>>> >> akin
>>> >> to Maya's "texture reference object" whereby the UV's are changed based
>>> >> on
>>> >> this static camera, for all the subsequent frames of the obj sequence ?
>>> >
>>> >
>>> > I've got a plugin that does exactly that. I'll see if I can share on
>>> > Nukepedia soon.
>>> >
>>> > Cheers,
>>> > Ivan
>>> >
>>> > On Wed, Mar 21, 2012 at 2:31 PM, Frank Rueter 
>>> > wrote:
>>> >
>>> >> 1 - UVProject creates UVs from scratch, with 0,0 in the lower left of
>>> >> the
>>> >> camera frustum and 1,1 in the upper right.
>>> >>
>>> >> 2 - been waiting for that feature a long time ;). It should be logged as
>>> >> a
>>> >> feature request but would certainly be good to report again to make sure
>>> >> (and to push it in priority)
>>> >>
>>> >>
>>> >>
>>> >>
>>> >> On 3/22/12 8:28 AM, a...@curvstudios.com wrote:
>>> >>
>>> >>> couple more questions:
>>> >>>
>>> >>> 1)  if imported geo does not already have UV's, will UVproject create a
>>> >>> new set or does it require them to...replace them ?
>>> >>>
>>> >>> 2)  if you've imported an obj sequence with UV's already on (for an
>>> >>> animated, deformable piece of geo)... and your using a static camera
>>> >>> (say
>>> >>> a single frame of your shot camera)... is there a way to do something
>>> >>> akin
>>> >>> to Maya's "texture reference object" whereby the UV's are changed based
>>> >>> on
>>> >>> this static camera, for all the subsequent frames of the obj sequence ?
>>> >>>
>>> >>> ..sorry if I'm too verbose...that was sort of a stream of consciousness
>>> >>> question.  Basically I'm asking if there is an easier way than my
>>> >>> current
>>> >>> method where I export an obj sequence with UV's, project3D on a single
>>> >>> frame, render with scanline to unwrapped UV, then input that into the
>>> >>> full
>>> >>> obj sequence to get my "paint" to stick throughout.
>>> >>>
>>> >>> oy, sorry again.
>>> >>>
>>> >>> Ari
>>> >>> Blue Sky
>>> >>>
>>> >>>
>>> >>>
>>> >>>  ivanbusquets wrote:
>>> 
>>> > You can think of UVProject as a "baked" or "sticky" projection.
>>> >
>>> > The main difference is how they'll behave if you transform/deform
>>> > your
>>> > geometry AFTER your projection.
>>> >
>>> > UVProject "bakes" the UV values into each vertex, so if you transform
>>> > those vertices later on, they'll still pull the textures from the
>>> > same
>>> > coordinate.
>>> >
>>> > The other difference between UVProject and Project3D is how they
>>> > behave
>>> > when the aspect ratio of the

Re: [Nuke-users] Re: difference between uv project and project 3d

2012-03-24 Thread ari Rubenstein
Sorry, that depth conversion question was for Michael, not Ivan... wrote too 
fast.

Ari

Sent from my iPhone

On Mar 23, 2012, at 8:44 PM, Michael Garrett  wrote:

> Totally agree it's made all the difference since the Shake days.  Thanks Ivan 
> for contributing this (and all the other stuff!).
> 
> Ari, I do have a gizmo version of a depth to Pworld conversion but it assumes 
> raw planar depth from camera.  Though once you start factoring in near and 
> far clipping planes, and different depth formats, it gets a bit more 
> complicated.  Ivan may have something to say on this.
> 
> Michael
> 
> 
> 
> On 23 March 2012 03:16, ari Rubenstein  wrote:
> Wow, much appreciated.
> 
> 
> Thinking back to how artists and studios in the film industry used to hold 
> tight their techniques for leverage and advantage, it's great to see how much 
> "this" comp community encourages and props up one another for creative 
> advancement for all. 
> 
> Thanks again Ivan
> 
> Ari
> 
> 
> Sent from my iPhone
> 
> On Mar 23, 2012, at 3:16 AM, Ivan Busquets  wrote:
> 
>> Hey Ari,
>> 
>> Here's the plugin I mentioned before.
>> 
>> http://www.nukepedia.com/gizmos/plugins/3d/stickyproject/
>> 
>> There's only compiled versions for Nuke 6.3 (MacOS and Linux64), but I've 
>> uploaded the source code as well, so someone else can compile it for Windows 
>> if needed
>> 
>> Hope it proves useful.
>> Cheers,
>> Ivan
>> 
>> On Wed, Mar 21, 2012 at 2:55 PM,   wrote:
>> thanks Frank for the clarification.
>> 
>> thanks Ivan for digging that plugin up if ya can.  i have a solution I
>> wrapped into a tool as well, but I'd love to see your approach as well.
>> 
>> 
>> 
>> >> 2)  if you've imported an obj sequence with UV's already on (for an
>> >> animated, deformable piece of geo)... and your using a static camera
>> >> (say
>> >> a single frame of your shot camera)... is there a way to do something
>> >> akin
>> >> to Maya's "texture reference object" whereby the UV's are changed based
>> >> on
>> >> this static camera, for all the subsequent frames of the obj sequence ?
>> >
>> >
>> > I've got a plugin that does exactly that. I'll see if I can share on
>> > Nukepedia soon.
>> >
>> > Cheers,
>> > Ivan
>> >
>> > On Wed, Mar 21, 2012 at 2:31 PM, Frank Rueter 
>> > wrote:
>> >
>> >> 1 - UVProject creates UVs from scratch, with 0,0 in the lower left of
>> >> the
>> >> camera frustum and 1,1 in the upper right.
>> >>
>> >> 2 - been waiting for that feature a long time ;). It should be logged as
>> >> a
>> >> feature request but would certainly be good to report again to make sure
>> >> (and to push it in priority)
>> >>
>> >>
>> >>
>> >>
>> >> On 3/22/12 8:28 AM, a...@curvstudios.com wrote:
>> >>
>> >>> couple more questions:
>> >>>
>> >>> 1)  if imported geo does not already have UV's, will UVproject create a
>> >>> new set or does it require them to...replace them ?
>> >>>
>> >>> 2)  if you've imported an obj sequence with UV's already on (for an
>> >>> animated, deformable piece of geo)... and your using a static camera
>> >>> (say
>> >>> a single frame of your shot camera)... is there a way to do something
>> >>> akin
>> >>> to Maya's "texture reference object" whereby the UV's are changed based
>> >>> on
>> >>> this static camera, for all the subsequent frames of the obj sequence ?
>> >>>
>> >>> ..sorry if I'm too verbose...that was sort of a stream of consciousness
>> >>> question.  Basically I'm asking if there is an easier way than my
>> >>> current
>> >>> method where I export an obj sequence with UV's, project3D on a single
>> >>> frame, render with scanline to unwrapped UV, then input that into the
>> >>> full
>> >>> obj sequence to get my "paint" to stick throughout.
>> >>>
>> >>> oy, sorry again.
>> >>>
>> >>> Ari
>> >>> Blue Sky
>> >>>
>> >>>
>> >>>
>> >>>  ivanbusquets wrote:
>> 
>> > You can think of UVProject as a "baked" or "sticky" projection.
>> >
>> > The main difference is how they'll behave if you transform/deform
>> > your
>> > geometry AFTER your projection.
>> >
>> > UVProject "bakes" the UV values into each vertex, so if you transform
>> > those vertices later on, they'll still pull the textures from the
>> > same
>> > coordinate.
>> >
>> > The other difference between UVProject and Project3D is how they
>> > behave
>> > when the aspect ratio of the camera window is different than the
>> > aspect
>> > ratio of the projected image.
>> > With UVProject, projection is defined by both the horizontal and
>> > vertical aperture. Project3D only takes the horizontal aperture, and
>> > preserves the aspect ratio of whatever image you're projecting.
>> >
>> >
>> > Hope that makes sense.
>> >
>> > Cheers,
>> > Ivan
>> >
>> >
>> > On Tue, Mar 20, 2012 at 4:50 PM, coolchipper  wrote:
>> >
>> > hey Nukers, may be a very basic question, but i
>> > wanted
>> >> to know
>> >>

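To make the quoted UVProject/Project3D explanation a bit more concrete: a
UVProject-style "sticky" projection boils down to computing each vertex's camera
UV once, at a chosen frame, and storing it on the vertex (0,0 at the lower left
of the frustum, 1,1 at the upper right, using both apertures), whereas a
Project3D-style projection is re-evaluated against whatever the geometry and
camera are doing at render time. A rough Python sketch with made-up helper names
and a simple pinhole camera, not the actual node internals:

import numpy as np

def camera_uv(p_world, world_to_cam, focal, haperture, vaperture):
    # project a world-space point through the camera and remap the frustum
    # from -1..1 NDC to 0..1 UV (0,0 lower left, 1,1 upper right)
    p_cam = world_to_cam @ np.append(p_world, 1.0)
    ndc_x = p_cam[0] * focal / (0.5 * haperture) / -p_cam[2]
    ndc_y = p_cam[1] * focal / (0.5 * vaperture) / -p_cam[2]
    return 0.5 * (ndc_x + 1.0), 0.5 * (ndc_y + 1.0)

def bake_uvs(points_at_reference_frame, world_to_cam, focal, hap, vap):
    # UVProject-style: computed once and stored per vertex, so the texture
    # sticks even if those vertices move or deform afterwards
    return [camera_uv(p, world_to_cam, focal, hap, vap)
            for p in points_at_reference_frame]

def live_projection_uv(point_now, world_to_cam_now, focal, hap, vap):
    # Project3D-style: evaluated with the current geometry and camera, so the
    # texture effectively re-projects (and can swim) as the geometry deforms
    return camera_uv(point_now, world_to_cam_now, focal, hap, vap)
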
Re: [Nuke-users] Re: difference between uv project and project 3d

2012-03-24 Thread ari Rubenstein
Ivan,
If one doesn't alter the camera clip planes, and one understands the inversion of
varying depth between Maya, Nuke and such... otherwise it's pretty
straightforward, like Nathan's?

Ari

Sent from my iPhone

On Mar 23, 2012, at 8:44 PM, Michael Garrett  wrote:

> Totally agree it's made all the difference since the Shake days.  Thanks Ivan 
> for contributing this (and all the other stuff!).
> 
> Ari, I do have a gizmo version of a depth to Pworld conversion but it assumes 
> raw planar depth from camera.  Though once you start factoring in near and 
> far clipping planes, and different depth formats, it gets a bit more 
> complicated.  Ivan may have something to say on this.
> 
> Michael
> 
> 
> 
> On 23 March 2012 03:16, ari Rubenstein  wrote:
> Wow, much appreciated.
> 
> 
> Thinking back to how artists and studios in the film industry used to hold 
> tight their techniques for leverage and advantage, it's great to see how much 
> "this" comp community encourages and props up one another for creative 
> advancement for all. 
> 
> Thanks again Ivan
> 
> Ari
> 
> 
> Sent from my iPhone
> 
> On Mar 23, 2012, at 3:16 AM, Ivan Busquets  wrote:
> 
>> Hey Ari,
>> 
>> Here's the plugin I mentioned before.
>> 
>> http://www.nukepedia.com/gizmos/plugins/3d/stickyproject/
>> 
>> There's only compiled versions for Nuke 6.3 (MacOS and Linux64), but I've 
>> uploaded the source code as well, so someone else can compile it for Windows 
>> if needed
>> 
>> Hope it proves useful.
>> Cheers,
>> Ivan
>> 
>> On Wed, Mar 21, 2012 at 2:55 PM,   wrote:
>> thanks Frank for the clarification.
>> 
>> thanks Ivan for digging that plugin up if ya can.  i have a solution I
>> wrapped into a tool as well, but I'd love to see your approach as well.
>> 
>> 
>> 
>> >> 2)  if you've imported an obj sequence with UV's already on (for an
>> >> animated, deformable piece of geo)... and your using a static camera
>> >> (say
>> >> a single frame of your shot camera)... is there a way to do something
>> >> akin
>> >> to Maya's "texture reference object" whereby the UV's are changed based
>> >> on
>> >> this static camera, for all the subsequent frames of the obj sequence ?
>> >
>> >
>> > I've got a plugin that does exactly that. I'll see if I can share on
>> > Nukepedia soon.
>> >
>> > Cheers,
>> > Ivan
>> >
>> > On Wed, Mar 21, 2012 at 2:31 PM, Frank Rueter 
>> > wrote:
>> >
>> >> 1 - UVProject creates UVs from scratch, with 0,0 in the lower left of
>> >> the
>> >> camera frustum and 1,1 in the upper right.
>> >>
>> >> 2 - been waiting for that feature a long time ;). It should be logged as
>> >> a
>> >> feature request but would certainly be good to report again to make sure
>> >> (and to push it in priority)
>> >>
>> >>
>> >>
>> >>
>> >> On 3/22/12 8:28 AM, a...@curvstudios.com wrote:
>> >>
>> >>> couple more questions:
>> >>>
>> >>> 1)  if imported geo does not already have UV's, will UVproject create a
>> >>> new set or does it require them to...replace them ?
>> >>>
>> >>> 2)  if you've imported an obj sequence with UV's already on (for an
>> >>> animated, deformable piece of geo)... and your using a static camera
>> >>> (say
>> >>> a single frame of your shot camera)... is there a way to do something
>> >>> akin
>> >>> to Maya's "texture reference object" whereby the UV's are changed based
>> >>> on
>> >>> this static camera, for all the subsequent frames of the obj sequence ?
>> >>>
>> >>> ..sorry if I'm too verbose...that was sort of a stream of consciousness
>> >>> question.  Basically I'm asking if there is an easier way than my
>> >>> current
>> >>> method where I export an obj sequence with UV's, project3D on a single
>> >>> frame, render with scanline to unwrapped UV, then input that into the
>> >>> full
>> >>> obj sequence to get my "paint" to stick throughout.
>> >>>
>> >>> oy, sorry again.
>> >>>
>> >>> Ari
>> >>> Blue Sky
>> >>>
>> >>>
>> >>>
>> >>>  ivanbusquets wrote:
>> 
>> > You can think of UVProject as a "baked" or "sticky" projection.
>> >
>> > The main difference is how they'll behave if you transform/deform
>> > your
>> > geometry AFTER your projection.
>> >
>> > UVProject "bakes" the UV values into each vertex, so if you transform
>> > those vertices later on, they'll still pull the textures from the
>> > same
>> > coordinate.
>> >
>> > The other difference between UVProject and Project3D is how they
>> > behave
>> > when the aspect ratio of the camera window is different than the
>> > aspect
>> > ratio of the projected image.
>> > With UVProject, projection is defined by both the horizontal and
>> > vertical aperture. Project3D only takes the horizontal aperture, and
>> > preserves the aspect ratio of whatever image you're projecting.
>> >
>> >
>> > Hope that makes sense.
>> >
>> > Cheers,
>> > Ivan
>> >
>> >
>> > On Tue, Mar 20, 2012 at 4:50 PM, coolchipper  wrote:
>> >
>> >

Re: [Nuke-users] Re: difference between uv project and project 3d

2012-03-24 Thread Ivan Busquets
Thanks guys, I also think we have a great community, and it's a pleasure to
share when possible, as much as it is to learn from everyone who
participates.

@Thorsten: thanks for the Windows compile. I'll upload it to Nukepedia as
well.

Cheers,
Ivan

On Fri, Mar 23, 2012 at 5:44 PM, Michael Garrett wrote:

> Totally agree it's made all the difference since the Shake days.  Thanks
> Ivan for contributing this (and all the other stuff!).
>
> Ari, I do have a gizmo version of a depth to Pworld conversion but it
> assumes raw planar depth from camera.  Though once you start factoring in
> near and far clipping planes, and different depth formats, it gets a bit
> more complicated.  Ivan may have something to say on this.
>
> Michael
>
>
>
>
> On 23 March 2012 03:16, ari Rubenstein  wrote:
>
>> Wow, much appreciated.
>>
>>
>> Thinking back to how artists and studios in the film industry used to
>> hold tight their techniques for leverage and advantage, it's great to see
>> how much "this" comp community encourages and props up one another for
>> creative advancement for all.
>>
>> Thanks again Ivan
>>
>> Ari
>>
>>
>> Sent from my iPhone
>>
>> On Mar 23, 2012, at 3:16 AM, Ivan Busquets 
>> wrote:
>>
>> Hey Ari,
>>
>> Here's the plugin I mentioned before.
>>
>> http://www.nukepedia.com/gizmos/plugins/3d/stickyproject/
>>
>> There's only compiled versions for Nuke 6.3 (MacOS and Linux64), but I've
>> uploaded the source code as well, so someone else can compile it for
>> Windows if needed
>>
>> Hope it proves useful.
>> Cheers,
>> Ivan
>>
>> On Wed, Mar 21, 2012 at 2:55 PM,  wrote:
>>
>>> thanks Frank for the clarification.
>>>
>>> thanks Ivan for digging that plugin up if ya can.  i have a solution I
>>> wrapped into a tool as well, but I'd love to see your approach as well.
>>>
>>>
>>>
>>> >> 2)  if you've imported an obj sequence with UV's already on (for an
>>> >> animated, deformable piece of geo)... and your using a static camera
>>> >> (say
>>> >> a single frame of your shot camera)... is there a way to do something
>>> >> akin
>>> >> to Maya's "texture reference object" whereby the UV's are changed
>>> based
>>> >> on
>>> >> this static camera, for all the subsequent frames of the obj sequence
>>> ?
>>> >
>>> >
>>> > I've got a plugin that does exactly that. I'll see if I can share on
>>> > Nukepedia soon.
>>> >
>>> > Cheers,
>>> > Ivan
>>> >
>>> > On Wed, Mar 21, 2012 at 2:31 PM, Frank Rueter 
>>> > wrote:
>>> >
>>> >> 1 - UVProject creates UVs from scratch, with 0,0 in the lower left of
>>> >> the
>>> >> camera frustum and 1,1 in the upper right.
>>> >>
>>> >> 2 - been waiting for that feature a long time ;). It should be logged
>>> as
>>> >> a
>>> >> feature request but would certainly be good to report again to make
>>> sure
>>> >> (and to push it in priority)
>>> >>
>>> >>
>>> >>
>>> >>
>>> >> On 3/22/12 8:28 AM, a...@curvstudios.com wrote:
>>> >>
>>> >>> couple more questions:
>>> >>>
>>> >>> 1)  if imported geo does not already have UV's, will UVproject
>>> create a
>>> >>> new set or does it require them to...replace them ?
>>> >>>
>>> >>> 2)  if you've imported an obj sequence with UV's already on (for an
>>> >>> animated, deformable piece of geo)... and your using a static camera
>>> >>> (say
>>> >>> a single frame of your shot camera)... is there a way to do something
>>> >>> akin
>>> >>> to Maya's "texture reference object" whereby the UV's are changed
>>> based
>>> >>> on
>>> >>> this static camera, for all the subsequent frames of the obj
>>> sequence ?
>>> >>>
>>> >>> ..sorry if I'm too verbose...that was sort of a stream of
>>> consciousness
>>> >>> question.  Basically I'm asking if there is an easier way than my
>>> >>> current
>>> >>> method where I export an obj sequence with UV's, project3D on a
>>> single
>>> >>> frame, render with scanline to unwrapped UV, then input that into the
>>> >>> full
>>> >>> obj sequence to get my "paint" to stick throughout.
>>> >>>
>>> >>> oy, sorry again.
>>> >>>
>>> >>> Ari
>>> >>> Blue Sky
>>> >>>
>>> >>>
>>> >>>
>>> >>>  ivanbusquets wrote:
>>> 
>>> > You can think of UVProject as a "baked" or "sticky" projection.
>>> >
>>> > The main difference is how they'll behave if you transform/deform
>>> > your
>>> > geometry AFTER your projection.
>>> >
>>> > UVProject "bakes" the UV values into each vertex, so if you
>>> transform
>>> > those vertices later on, they'll still pull the textures from the
>>> > same
>>> > coordinate.
>>> >
>>> > The other difference between UVProject and Project3D is how they
>>> > behave
>>> > when the aspect ratio of the camera window is different than the
>>> > aspect
>>> > ratio of the projected image.
>>> > With UVProject, projection is defined by both the horizontal and
>>> > vertical aperture. Project3D only takes the horizontal aperture,
>>> and
>>> > preserves the aspect ratio of whatever image you're projecting.
>>> >
>>> >
>>>