Brecht,

>> One thing that I seem to observe is that the hit to camera needs to
>> be pretty much hard-wired, and the light to hit too. Is that right?
>
> It depends a bit, but if you want the rendering algorithm to support
> all possible node setups, then yes that needs to be hard-wired. But if
> it doesn't make sense for some algorithm to support e.g. a Light node,
> then that can just be not supported there.

The way I see the BSDF node (or node setup), it is pretty much
unlinked to and from other nodes. Explicitly linking the light and the
output to the BSDF seems of limited use. I understand that it can let
someone experiment and invent creative ways of rendering, but that is
not an aspect I want to concentrate on for now.

>> How are multiple-reflection bounces represented in a nodes setup?
>> Implicitly or explicitly?
>
> Implicitly, bounces would use the BSDF.

I agree. The BSDF node setup needs to be usable standalone, just
associated with a material that is in turn associated with a surface.
At least, that is how I imagine its use in a general rendering
algorithm: the node would be called by the rendering algorithm through
its association with the hit surface. I can imagine a standard
interface for a node setup designed to be used as a BSDF. The outer
shell node would supply this interface, and the user could go wild
inside the outer shell node and be very creative with the BSDF
behavior. (A rough sketch of such an interface is at the end of this
mail.)

> Not sure how to do emission yet, I guess there needs to be a separate
> output for that as well in which a color would be plugged (or should
> that be an EDF?).

I can see a light node onto which we could link a color or a texture
and a distribution function. But it gets more complicated when any
object can become an emitter. An object can be emissive with a
specific emission pattern while its surface is reflective with a
different reflectance pattern. Normally, the light color is filtered
by the object's surface color, but you will probably want to keep
maximum flexibility and allow an emission color pattern that is very
different from the surface color pattern. It is not obvious to me
either how to represent that with nodes. (A sketch of separate BSDF
and EDF slots is at the end of this mail.) And uniformly sampling an
object of arbitrary morphology is not trivial.

Although measured BSDF data are not common, measured EDF data are
quite common these days and available from many light fixture
manufacturers. Otherwise, the directional emission pattern can be
simulated with a traditional distribution function, just like for
BSDFs.

Also, there is the issue of directional lights (sun lights) in GI
algorithms. Some thought needs to be given to that. It is very
inefficient to throw samples and hope to hit something useful in
situations where the sun is shining through windows. Metropolis Light
Transport can handle those situations quite well, but for other GI
rendering algorithms it is more efficient to associate sun lights with
those windows in some way. Hence the concept of light ports or photon
ports. But how to represent that with nodes?

>> BTW, I assume that the Light node represents any light in the scene.
>> Or is one Light node associated to only one light in the scene?
>
> That would be configurable, the default would be all lights, but I
> guess you would be able to set a light group.

There too, I see the light node setup as usable standalone, just
associated with a material and a surface, and called by the rendering
algorithm. In hard-wired light-to-BSDF node setups, an efficiency
issue would be how to importance-sample the lights in a scene with
multiple lights, each with its own emission characteristics. (A sketch
of one way to do that is at the end of this mail.) Still, the same
question arises: how to represent that with nodes?
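To make the "outer shell" interface idea above more concrete, here is
a very rough C++ sketch. All names in it are hypothetical, not
existing Blender code; it only shows the kind of contract a rendering
algorithm could rely on, whatever graph the user builds inside the
shell:

    // Hypothetical types; nothing here is existing Blender API.
    struct float3 { float x, y, z; };

    // What the renderer hands to the node setup at a hit point.
    struct ShadePoint {
        float3 P;  // hit position
        float3 N;  // shading normal
        float3 I;  // incoming (viewing) direction
    };

    // The fixed contract between a rendering algorithm and a BSDF node
    // setup: the algorithm only ever calls these three functions, and
    // whatever is inside the shell node must implement them.
    class BsdfShell {
    public:
        virtual ~BsdfShell() {}

        // Reflectance for a given outgoing direction.
        virtual float3 eval(const ShadePoint& sp,
                            const float3& omega_out) const = 0;

        // Importance-sample an outgoing direction from two random
        // numbers; also returns the pdf of that direction.
        virtual float3 sample(const ShadePoint& sp, float r1, float r2,
                              float* pdf) const = 0;

        // Pdf of a given direction, needed for multiple importance
        // sampling against the lights.
        virtual float pdf(const ShadePoint& sp,
                          const float3& omega_out) const = 0;
    };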
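For the emission side, the point about separate reflectance and
emission patterns could simply map onto two independent slots in the
material, so an emitter can reflect with one pattern and emit with a
completely different one. Continuing the hypothetical sketch above:

    // An EDF shell, symmetric to the BSDF one; a measured (IES-style)
    // distribution from a fixture manufacturer could sit behind it.
    class EdfShell {
    public:
        virtual ~EdfShell() {}

        // Emitted radiance leaving the surface in direction omega_out.
        virtual float3 emission(const ShadePoint& sp,
                                const float3& omega_out) const = 0;
    };

    // The material just holds both slots independently.
    struct Material {
        BsdfShell* bsdf;  // null for a pure emitter
        EdfShell*  edf;   // null for a non-emissive surface
    };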
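And for the many-lights efficiency issue, the usual approach
(independent of how it would be represented with nodes) is to pick a
light with probability proportional to an estimate of its power.
Another hypothetical sketch:

    #include <algorithm>
    #include <vector>

    // Build a cumulative distribution over approximate light power
    // once, then pick lights proportionally to it instead of uniformly.
    struct LightSampler {
        std::vector<float> cdf;  // cumulative, normalized to 1 at the end

        // 'power' holds an estimate of each light's emitted power,
        // e.g. its EDF integrated over directions times its area.
        void build(const std::vector<float>& power) {
            cdf.resize(power.size());
            float total = 0.0f;
            for (size_t i = 0; i < power.size(); i++) {
                total += power[i];
                cdf[i] = total;
            }
            for (size_t i = 0; i < cdf.size(); i++)
                cdf[i] /= total;
        }

        // Map a uniform random number r in [0,1) to a light index; the
        // probability of picking light i comes out as power[i]/total,
        // which is also what MIS weighting needs.
        int pick(float r, float* pdf_out) const {
            int i = int(std::lower_bound(cdf.begin(), cdf.end(), r)
                        - cdf.begin());
            float lo = (i == 0) ? 0.0f : cdf[i - 1];
            *pdf_out = cdf[i] - lo;
            return i;
        }
    };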
Yves

_______________________________________________
Bf-committers mailing list
Bf-committers@blender.org
http://lists.blender.org/mailman/listinfo/bf-committers