As far as I have seen, I can only feed an NSImage or a CIImage to an image port.
You can also feed CVImageBuffers, CGImages, etc...
I've managed to get it done with an NSImage, but this is way too slow, as I have to copy the texture into an NSBitmapImageRep. Additionally, the R and B channels were swapped. :-/
That's an endian bug: make sure you copy the texture contents to the NSBitmapImageRep correctly depending on whether you're running on PPC or x86.
So I'd like to get it to work with a CIImage, since that claims to work directly with a texture already located in the graphics card's VRAM.
This will only work _if_ the texture it points to is defined on the same GL context as the one the QCRenderer was initialized with (or on a context shared with it). Otherwise, the texture is simply not visible to QC.
You can also try drawing the texture you get into a CVOpenGLBuffer and then passing that to QC. Although that involves a VRAM copy, it may end up performing better.
________________________ Pierre-Olivier Latour [EMAIL PROTECTED]
Quartzcomposer-dev mailing list ([email protected])

