I used to reset all viewers to None when a script was prepared for sending to
the farm to work around this bug.
When switching to OCIO that was no longer a problem, as the show-specific config
is always set through the environment variable.
So what do you define in your menu.py, Howard?
Nathan Rusch
The latest version of the convolve node has gpu support.
On 20.03.2017 at 01:20, jon parker wrote:
Greetings Nuke users,
I'm just wondering if there are any faster, more robust FFT tools
available for Nuke besides the (hidden) built-in nodes?
The built-ins do the job, but they are pretty slow
The button is executed in the context of the gizmo so allNodes returns the
nodes found inside it.
You have to switch to the root context first
with nuke.root().begin():
...
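A slightly fuller sketch of that context switch, using the explicit begin()/end() pair (just one way of doing it):

import nuke

root = nuke.root()
root.begin()                  # switch to the top-level node graph
try:
    nodes = nuke.allNodes()   # now lists the nodes at root level, not inside the gizmo
finally:
    root.end()                # always restore the previous context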
Kind regards,
Michael
Michael Hodges wrote:
>I’ve got a callback that affects a certain gizmo class recursively
>thr
Hi,
What do you mean, you didn't find a node? What about the OCIOCDLTransform
and the OCIOFileTransform? They both work perfectly for me!
Kind regards,
Michael
Howard Jones wrote:
>Hi
>
>We have a CDL which we would like to get into Nuke.
>So far we can’t find a node that can read CDLs (J-O
Yes, and even if you have the additional bg samples behind your
defocused foreground, Bokeh is not using them.
On 09.02.2017 at 19:04, Nathan Rusch wrote:
This isn't too surprising. Your original deep data has discrete color
samples for each deep sample in places where objects overlap, but your
ys
>the
>same with the halo.
>Any other tips?
>
>Thanks
>Gabor
>
>
>On Thu, Feb 9, 2017 at 2:34 PM, Michael Habenicht
>wrote:
>
>> Hi,
>> do you have the 'target input alpha' checkbox on the DeepRecolor node
>> checked? Turning this option
Hi,
do you have the 'target input alpha' checkbox on the DeepRecolor node checked?
Turning this option on might solve your problem.
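If you want to set it from Python, something like this should do it; the node name is a placeholder and the knob's script name is my guess from the UI label, so double-check it:

import nuke

recolor = nuke.toNode('DeepRecolor1')          # placeholder node name
recolor['target_input_alpha'].setValue(True)   # script name assumed from the 'target input alpha' label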
Kind regards,
Michael
"Gabor L. Toth" wrote:
>Hi,
>
>we are testing peregrine's bokeh in our pipeline. I would like to use
>deep
>input for perfect defocus. With
For a filmic dissolve you should convert it to log, not to a video/gamma-corrected
space.
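A rough node-graph sketch of that from Python; the Read node names are placeholders and Log2Lin is just one way to do the lin/log round trip:

import nuke

shot_a = nuke.toNode('Read1')   # placeholder names
shot_b = nuke.toNode('Read2')
a_log = nuke.nodes.Log2Lin(operation='lin2log', inputs=[shot_a])
b_log = nuke.nodes.Log2Lin(operation='lin2log', inputs=[shot_b])
mix = nuke.nodes.Dissolve(inputs=[a_log, b_log])            # animate 'which' from 0 to 1
back = nuke.nodes.Log2Lin(operation='log2lin', inputs=[mix])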
Cheers!
Adrian Baltowski wrote:
>Hi
>
>Just gamma-correct things you want to dissolve: pump up gamma (with
>"Gamma" node for instance) before Dissolve - on both inputs- and invert
>gamma -correction after dis
Hello Daniel,
you are missing the quotes: it has to be setValue("nuke.message('...')").
Otherwise the message is shown right away and the return value (None) of that
call is set; that is why you get the error, because the value has to be a string.
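For example on a PyScript button knob (knob name and message are just placeholders):

import nuke

node = nuke.selectedNode()
k = nuke.PyScript_Knob('notify', 'Notify')
k.setValue("nuke.message('hello')")   # note the outer quotes: the command is stored as a string
node.addKnob(k)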
Kind regards,
Michael
On November 1, 2016 11:02:22 AM CET, Danie
Sure, just blur it before unpremultiplying, then put the original back over the
result afterwards. That is the simplest and easiest way!
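A quick sketch of that tree from Python; the merge order is my reading of it, with the untouched original going back on top:

import nuke

src = nuke.selectedNode()                          # the premultiplied source
blurred = nuke.nodes.Blur(size=20, inputs=[src])   # smear colour (and alpha) past the edge
extended = nuke.nodes.Unpremult(inputs=[blurred])  # divide by the blurred alpha to push colour outwards
out = nuke.nodes.Merge2(operation='over', inputs=[extended, src])  # B = extended, A = original on top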
Best regards,
Michael
On September 20, 2016 9:48:58 PM CEST, Gary Jaeger wrote:
>Anybody have a good way of extending the edge pixels of a premulted
>image? for instance, if I had
If you have a knob called mix which contains the amount of one of the cameras,
the expression should be this:
Camera4.world_matrix*mix+Camera5.world_matrix*(1-mix)
With this you should be able to mix between the two camera paths.
Best regards,
Michael
On September 20, 2016 10:21:04 PM CEST, Brun
As far as I know it is unfortunately not possible at all to get it without
Shotgun ...
Michael Garrett wrote:
>Sorry for the OT, but has anyone else had problems with actually buying
>RV?
>It seems maybe the issue is something up with tickets in their support
>system not getting re-routed to th
ut them in Nuke.
>
>
>On Fri, May 6, 2016 at 12:46 PM, Randy Little
>wrote:
>
>> That's just crazy. I guess a lot of people are not currently happy
>with
>> this based on the academy color google group.
>>
>> Randy S. Little
>> http://www.rsli
The reference space in the ACES 1.0.1 OCIO config is ACEScg, so you first have to
convert to that. The compositing log space is ACEScc, which means the
logconvert is converting from ACEScg to ACEScc. I don't know whether it is
going to change the result. Also be aware that you might get negativ
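If you prefer an explicit node over the logconvert, an OCIOColorSpace set up like this should do the same conversion; the exact colorspace names depend on your config, these are what I remember from the ACES 1.0.1 config:

import nuke

cs = nuke.nodes.OCIOColorSpace()
cs['in_colorspace'].setValue('ACES - ACEScg')    # names as listed in the ACES 1.0.1 config, check yours
cs['out_colorspace'].setValue('ACES - ACEScc')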
Hi,
you should put a lin2log conversion node before creating your LUT and before
applying it. Usually LUTs convert from log space to their destination because
in this case there are no values above one. If it is not possible to convert
your source to log before applying the LUT you have to live
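As a rough sketch of the apply side (the Vectorfield node is just one way to load a file LUT; the path is a placeholder):

import nuke

src = nuke.selectedNode()
to_log = nuke.nodes.Log2Lin(operation='lin2log', inputs=[src])   # same conversion used when the LUT was built
apply_lut = nuke.nodes.Vectorfield(inputs=[to_log])
apply_lut['vfield_file'].setValue('/path/to/my_lut.cube')        # placeholder path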
Hi,
I don't think this is possible because you can only reference metadata which is
bound to your current shot. How would you know which shot object has the same
timing on another track?
One solution could be to add a tag to the shot that holds the information you're
after as metadata. But I am no
hiero.ui.getTimelineEditor(hiero.ui.activeSequence()).selection()
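A slightly longer usage sketch for the Script Editor:

import hiero.ui

editor = hiero.ui.getTimelineEditor(hiero.ui.activeSequence())
for item in editor.selection():
    print(item.name())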
theodor groeneboom wrote:
>Hiya list!
>
>Is there a nice and easy way to collect the currently selected track
>items in a sequence in NukeStudio ?
>
>like the good ol n = nuke.selectedNodes() ?
>
>I searched and looked for some
Well, it is how OCIO works and how the nuke-default config is implemented. It is
all defined through 1D lookup tables, and that means it is defined only in a
limited range, in this case -0.125 to 1.125. So when you convert back from
linear it gets clamped to this range.
The colorspace node on the
Hi,
In composite mode, check unpremultiplied output, put a Premult node after it, and
merge it over your background. There is not much more magic behind it!
Best regards,
Michael
On April 9, 2015 3:20:07 AM CST (China), Peter Sid wrote:
>Hey guys
>
>Does anyone have a setup or a description o
Hi,
I might be overlooking something here, but I would just load in the Alembic,
select the vertex at the position you are after and snap a Nuke Axis node
there. If you need it to follow an animation, use this script from
nukepedia: http://www.nukepedia.com/python/3d/animatedsnap3d
If you have only the ca
Hi John,
Do you have J-Ops on Nuke 7 but not on Nuke 8? Reading nodes from recursive
folders is not an out-of-the-box feature.
J-Ops is the first tool implementing this that comes to mind, and it is not
available for Nuke 8, although the Python tools should still work, as it's only
the plugins which
Hi,
You also have to specify the node class: Viewer.viewerProcess
That way you override the default that the OCIO setup uses, which is the
default display and default view.
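In menu.py that would look something like this; the viewer process name is only an example and has to match a display/view from your config:

import nuke

nuke.knobDefault('Viewer.viewerProcess', 'Film (sRGB)')   # example value, use a display/view from your OCIO config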
Best regards,
Michael
"Neil Rögnvaldr Scholes" wrote:
>Hi
>
>I know how to tell Nuke to default to using OCIO spi-vfx on
I was also wondering why this would be better?! Just press Ctrl when clicking
on it to get the old one!
On February 26, 2014 8:13:11 PM CET, Feli wrote:
>I just got a look at the new color controls in Nuke8 and have a few
>questions:
>
>- I would like to make them bigger and place them in a floa
Did you try to write out an alembic? I guess that should work!
Best regards,
Michael
Ron Ganbar wrote:
>Hi there,
>is it possible to create a particle system in NukeX and then transfer
>it,
>as it is, to Nuke? For example, by writing the scene as an FBX or
>something?
>
>
>Ron Ganbar
>email: ro
Not that I know of. I would just use the Radial node.
On 14.05.2013 20:48, Nick Guth wrote:
I'm a bit embarrassed to ask, but is there a way to create perfectly centered
roto shapes in Nuke? In AE you can double-click the shape and have it fill the
composition's format with the shape, but in
Set it to stabilize instead of matchmove or the other way round depending on
what you want to achieve.
Best regards,
Michael
--
DI (FH) Michael Habenicht
Digital Film Compositor & TD
http://www.tinitron.de
m...@tinitro
Hi Sean,
use the depth map of your particle render to displace your various 2D
noise maps in one eye horizontally to make it work in stereo.
Best regards,
Michael
On 11.12.2012 18:13, Sean Falcon wrote:
Hi All,
I have a simple particle system of dust floating in the air that I've created
i
The good thing about stereo in Nuke is that you have both eyes in one stream,
and therefore every node works on both eyes at the same time.
To use this you have to join your views with the JoinView node.
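A tiny sketch of that, assuming the project already has left and right views and two per-eye Read nodes with these placeholder names:

import nuke

left = nuke.toNode('Read_left')     # placeholder names
right = nuke.toNode('Read_right')
stereo = nuke.nodes.JoinView(inputs=[left, right])   # input order follows the view order in the project settings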
But I saw in your screenshots that you have the name of your view in the
layer name, which you ha
J-ops includes a plugin for this!
Vincent Langer wrote:
>hi there nuke-list,
>
>i was wondering if it is possible to merge different exposures of an
>image or an image sequence into one HDR file like in Photoshop or PTGui
>
>or other HDR tools?
>
>cheers,
>vincent
>
>--
Hi,
did you set the metadata knob in the write node to "all metadata"?
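Or from Python, assuming a Write node called Write1:

import nuke

nuke.toNode('Write1')['metadata'].setValue('all metadata')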
Best regards,
Michael
--
DI (FH) Michael Habenicht
Digital Film Compositor & TD
http://www.tinitron.de
m...@tinitron.de
--
As long as you are not using the "-i" option when starting the
command-line Nuke, it uses a render license.
Best regards,
Michael
On 19.09.2012 23:30, Jellyman wrote:
I'd like to send a script to render using a floating render only license
but can't find any documentation on how to do this
Hi,
just use exit instead of close. I don't know of any other way in any software.
When you close a file or project, the software resets itself to its defaults.
Best regards,
Michael
Happyrender wrote:
>Hello!
>
>I believe this is my first post here, so hello everybody!
>
>We've just adopted NukeX 6
com/
On 10 June 2012 16:11, Michael Habenicht wrote:
Hi Ron,
as it is only one transform you can calculate it with the matrix
of the transform node. I wrapped it in a NoOp here:
set cut_paste_input [stack 0]
versi
(0).knob('matrix').value().transform(nuke.math.Vector3(nuke.thisNode().knob('pos').value()\[0\],
nuke.thisNode().knob('pos').value()\[1\], 0))\[1\]]"}}
}
Connect the transform node to the input and set the pos knob to the
position you want to transform through
Hi Thomas,
you are right, the Pworld pass is already the first part. We have the
screen space and the corresponding world position. But to be able to
calculate the disparity you need the screen space position for this
particular point viewed through the second camera. It is possible to
calculat
d the sample position. Don't forget to do it both ways
from left to right and vice versa.
Best regards,
Michael
--
DI (FH) Michael Habenicht
Digital Film Compositor & TD
http://www.tinitron.de
m...@tinitron.de
-
. This should work
for any two cameras.
A disparity shader is built on the same concept.
Best regards,
Michael
--
DI (FH) Michael Habenicht
Digital Film Compositor & TD
http://www.tinitron.de
m...@tinitro
cameras to create
the disparity.
Best regards,
Michael
--
DI (FH) Michael Habenicht
Digital Film Compositor & TD
http://www.tinitron.de
m...@tinitron.de
--
- Original Message -
From: russia...@gmail.co
Hi,
that is what expressions are for. You can add one to the tile_color knob which
sets the color according to the mix value.
The callbacks are mostly to catch things the user did.
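A simple example of such an expression; the threshold and the packed 0xRRGGBBAA colours are arbitrary:

import nuke

node = nuke.selectedNode()
# red above 50% mix, green below
node['tile_color'].setExpression('mix > 0.5 ? 0xff0000ff : 0x00ff00ff')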
Best regards,
Michael
--
DI (FH) Michael Habenicht
Digital Film Compositor
Yes, it does; I tried it some weeks ago.
You just have to place it outside of the viewport. The format defines your
0-1 range, so if you want to put your image in the 1-2 range, transform it to
the right by your whole width.
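For example with a Transform node, letting the expression reuse the format width:

import nuke

t = nuke.nodes.Transform(inputs=[nuke.selectedNode()])
t['translate'].setExpression('width', 0)   # shift x by one full format width; y stays at 0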
Best regards,
Michael
- Original Message -
From: masondo...@gmai
I also had problems masking or stencilling all layers of a stream, so I created
this handy gizmo:
http://www.nukepedia.com/gizmos/tnt_maskall/
Best regards,
Michael
--
DI (FH) Michael Habenicht
Digital Film Compositor & TD
http://www.tinitron.
write node for each eye. The OneView
is not necessary as long as you do not have any Write node in the script that
is set to more than one eye, at least in my experience.
Best regards,
Michael
--
DI (FH) Michael Habenicht
compositing - vfx :: motiongraphics
In the Shuffle, set all the other channels to black except for the one you want
to solo,
do your transforms and other stuff on the channels,
then add them all back together.
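Something like this for the shuffle step (old-style Shuffle knobs, soloing red here):

import nuke

solo = nuke.nodes.Shuffle(inputs=[nuke.selectedNode()])
solo['red'].setValue('red')      # keep the channel you want to solo
solo['green'].setValue('black')  # black out the rest
solo['blue'].setValue('black')
solo['alpha'].setValue('black')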
Best regards,
Michael
--
DI (FH) Michael Habenicht
compositing - vfx :: motiongraphics
In general it's easy: just evaluate the file knob and set it as the new value:
filex = node['file'].evaluate()
node['file'].setValue(filex)
The problem is that the frame number also gets evaluated, so you have to find a
way to bring back "%04d" or whatever frame padding you are using.
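One way to bring the padding back, assuming the frame number sits between two dots right before the extension:

import re
import nuke

node = nuke.selectedNode()
filex = node['file'].evaluate()
# swap the baked frame number (e.g. ".1001.") for a padding token of the same length (".%04d.")
filex = re.sub(r'\.(\d+)\.(\w+)$',
               lambda m: '.%%0%dd.%s' % (len(m.group(1)), m.group(2)),
               filex)
node['file'].setValue(filex)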
- Original
Use the hidden node PositionToPoints:
press X, make sure TCL is selected, and type "PositionToPoints".
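Or from Python:

import nuke

nuke.createNode('PositionToPoints')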
Best regards,
Michael
- Original Message -
From: jorxs...@gmail.com
To: Nuke-users@support.thefoundry.co.uk
Date: 21.09.2011 04:39:09
Subject: [Nuke-users] renderable point clouds?
> hey
it as plugin for shake: Transform Coordinates
http://www.pixelmania.se/index.asp?page=shake/index.asp
Best regards,
Michael
--
DI (FH) Michael Habenicht
compositing - vfx :: motiongraphics :: dvd
http://www.tinitron.de
m...@tinitron.de
**
Digital Compositor
.value.left.g
Best regards,
Michael
--
DI (FH) Michael Habenicht
compositing - vfx :: motiongraphics :: dvd
http://www.tinitron.de
m...@tinitron.de
**
Digital Compositor & TD TRIXTER Film Munich
http://www.trixte
... and a gizmo is also only a Nuke script, so you can always open it in a
text editor, replace Gizmo with Group, and copy and paste it ...
Best regards,
Michael
--
DI (FH) Michael Habenicht
compositing - vfx :: motiongraphics :: dvd
http://www.tinitron.de
ant to use the Expression node maybe?
> To get a colour through python you'd have to use the node.sample()
>
>
> On Jul 26, 2011, at 11:04 PM, Michael Habenicht wrote:
>
> > Hello everybody,
> >
> > how can I pass red, green and blue values to a python func
the
expression node?
Or is there a better way?
Best regards,
Michael
--
DI (FH) Michael Habenicht
compositing - vfx :: motiongraphics :: dvd
http://www.tinitron.de
m...@tinitron.de
**
Digital Compositor & TD TRIXTER Film Munich
http://www.trixte
I think this gizmo is doing what you are looking for:
http://www.nukepedia.com/gizmos/tnt_line/
Best regards,
Michael
--
DI (FH) Michael Habenicht
compositing - vfx :: motiongraphics :: dvd
http://www.tinitron.de
m...@tinitron.de
Hi,
this is the fast way to clone with expressions:
http://www.nukepedia.com/gizmos/python-scripts/nodegraph/nshakeclone/
Best regards,
Michael
--
DI (FH) Michael Habenicht
compositing - vfx :: motiongraphics :: dvd
http://www.tinitron.de
m...@tinitron.de
rrect value or delete the whole line.
Best regards,
Michael
--
DI (FH) Michael Habenicht
compositing - vfx :: motiongraphics :: dvd
http://www.tinitron.de
m...@tinitron.de
**
Digital Compositor & TD TRIXTER Film Munich
http://www.t
Yes, that's true. I did some tests a while ago and compared all the formats
the GenerateLUT node can export, and only .cube and, I think, one other
were able to reproduce the exact grading. Unfortunately I have not had time
yet to send all these test files to support to file a bug.
So the solution is just e