> I would have a question related to a code snippet he published on kineme.net
> regarding the creation of custom protocols for QuartzComposer.
"Protocol" in QC has a very specific meaning. A "protocol" for sending data between machines isn't really related to QC "protocols." Creating custom QC protocols is unsupported.

> Specifically, I would like to know if it is technically feasible to framegrab
> the image of a composition, pack it into a custom protocol format (for which
> I have the definition) and send it over a 100 Mbit network to another machine
> (another end device which acts like a display).
>
> The end part (sending the image and displaying it) is in place and works; I
> am more concerned with getting the data I want from Quartz Composer and
> repacking it into my custom protocol...

Doing this should be fairly straightforward -- you can get an image of a composition in a variety of ways. I'd suggest creating your own CGLContextObj, creating an IOSurfaceRef, attaching it to an FBO (and supplying a depth buffer so QC renders depth stuff correctly), rendering QC into the context (which should render to the FBO), then locking the IOSurface, doing whatever repacking you need on the CPU, and sending it on its way.

If your repacking is GPU-friendly, you could do the work on the IOSurface with the GPU, and then only lock it to scrape the repacked bits out to send over the wire.

-- Christopher Wright [email protected]
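The pipeline described above (offscreen CGL context → IOSurface-backed FBO with a depth renderbuffer → QCRenderer → lock and repack) can be sketched roughly as follows. This is a rough Objective-C sketch under stated assumptions, not a drop-in implementation: error checking is omitted, the composition path, sRGB color space, and BGRA pixel format are illustrative choices, and `send_frame()` is a hypothetical stand-in for your own protocol-packing and network code.

```objc
#import <OpenGL/OpenGL.h>
#import <OpenGL/gl.h>
#import <OpenGL/CGLIOSurface.h>
#import <IOSurface/IOSurface.h>
#import <Quartz/Quartz.h>   // QCRenderer / QCComposition

// Hypothetical: wraps your custom protocol format and 100 Mbit transport.
extern void send_frame(const void *pixels, size_t rowBytes, size_t w, size_t h);

static IOSurfaceRef CreateSurface(size_t w, size_t h)
{
    NSDictionary *props = @{
        (id)kIOSurfaceWidth:           @(w),
        (id)kIOSurfaceHeight:          @(h),
        (id)kIOSurfaceBytesPerElement: @4,                     // 8-bit BGRA
        (id)kIOSurfacePixelFormat:     @((uint32_t)'BGRA'),
    };
    return IOSurfaceCreate((__bridge CFDictionaryRef)props);
}

void RenderAndSend(NSString *compositionPath, size_t w, size_t h)
{
    // 1. Create an offscreen CGL context of your own.
    CGLPixelFormatAttribute attrs[] = {
        kCGLPFAAccelerated,
        kCGLPFAColorSize, (CGLPixelFormatAttribute)24,
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pf; GLint npix;
    CGLChoosePixelFormat(attrs, &pf, &npix);
    CGLContextObj ctx;
    CGLCreateContext(pf, NULL, &ctx);
    CGLSetCurrentContext(ctx);

    // 2. Wrap the IOSurface in a rectangle texture, attach it to an FBO,
    //    and supply a depth renderbuffer so QC's 3D patches render correctly.
    IOSurfaceRef surface = CreateSurface(w, h);
    GLuint tex, fbo, depth;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
    CGLTexImageIOSurface2D(ctx, GL_TEXTURE_RECTANGLE_ARB, GL_RGBA,
                           (GLsizei)w, (GLsizei)h,
                           GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, surface, 0);
    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_RECTANGLE_ARB, tex, 0);
    glGenRenderbuffersEXT(1, &depth);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depth);
    glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT24,
                             (GLsizei)w, (GLsizei)h);
    glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                                 GL_RENDERBUFFER_EXT, depth);

    // 3. Render the composition into the context (and thus into the FBO).
    QCRenderer *renderer = [[QCRenderer alloc]
        initWithCGLContext:ctx
               pixelFormat:pf
                colorSpace:CGColorSpaceCreateWithName(kCGColorSpaceSRGB)
               composition:[QCComposition compositionWithFile:compositionPath]];
    glViewport(0, 0, (GLsizei)w, (GLsizei)h);
    [renderer renderAtTime:0.0 arguments:nil];
    glFlush();

    // 4. Lock the surface, repack the pixels on the CPU, send them on.
    IOSurfaceLock(surface, kIOSurfaceLockReadOnly, NULL);
    send_frame(IOSurfaceGetBaseAddress(surface),
               IOSurfaceGetBytesPerRow(surface), w, h);
    IOSurfaceUnlock(surface, kIOSurfaceLockReadOnly, NULL);
}
```

For continuous streaming you would call `renderAtTime:arguments:` in a loop with an advancing timestamp; for the GPU-friendly variant mentioned below, you would do the repacking into a second IOSurface with a shader and only lock that one for the network send.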
_______________________________________________
Quartzcomposer-dev mailing list ([email protected])

