I'm just taking my first tentative steps into OS X
programming using Xcode, but I have a couple of ideas
for things I'd like to work on in the future, and was
wondering if any of you more experienced programmers
could give me an idea of whether they could be
achieved relatively easily.

One of the things I'd really like to do is to create a
plugin for Quartz Composer that grabs an audio stream
from a particular source and draws the volume levels
for each frame's worth of audio as brightness values
across a 1D image strip of a given length.

So, in very simplified pseudocode, it would:

* Work out the current frame rate
* Grab one frame's worth of audio
* Create a 1D image of a set length
* Scale the audio information to the appropriate number of samples
* Write the level information into the image, with appropriate scaling
* Pass the image to QC
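As a sanity check on the idea (not QC plugin code, which would be Objective-C against the QCPlugIn API), the steps above can be sketched in plain Python. Everything here is an assumption for illustration: the sample rate, frame rate, and the choice of RMS per bucket as the "level" measure.

```python
import math

def audio_to_strip(samples, strip_length, frame_rate=30, sample_rate=44100):
    """Map one frame's worth of audio samples (floats in [-1, 1])
    to a 1D strip of brightness values (0-255)."""
    # How many audio samples correspond to one video frame.
    samples_per_frame = sample_rate // frame_rate
    frame = samples[:samples_per_frame]
    # Split the frame into strip_length buckets; the RMS level of
    # each bucket becomes that pixel's brightness.
    bucket = max(1, len(frame) // strip_length)
    strip = []
    for i in range(strip_length):
        chunk = frame[i * bucket:(i + 1) * bucket] or [0.0]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        strip.append(min(255, int(rms * 255)))
    return strip
```

In an actual plugin the same mapping would run per execution cycle, writing the bucket values into a one-pixel-high image buffer handed back to QC.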

If there's already a way to do something similar
within QC, then I'd love to know what it is.

Cheers!

alx


PS
I have also posted this to the Xcode-users list, so
apologies in advance if some of you receive it twice.



Quartzcomposer-dev mailing list      ([email protected])