Hi everybody.
I realize this may not be the best place for my request, but perhaps you can help with a Core Audio / AudioUnit question.

I have an application, working fine at the moment, that generates an audio buffer and sends it to Core Audio.

To do so I implemented a render callback and so on.
Up to there, no problem.

Now I want to add an extra feature to my application.

Currently the application behaves like this:

audioBuffer -> CoreAudio Output

However, I want it to behave like this:

audioBuffer -> AudioUnit AUMatrixReverb -> CoreAudio Output.

I need more information on how to work with AUGraph.

I looked at many samples, but none gave me what I expected.
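To make the chain above concrete, here is the kind of AUGraph setup I have been experimenting with. This is only a sketch of my understanding, assuming the default output unit and Apple's MatrixReverb; `myRenderProc` stands in for my existing render callback, and all error handling is omitted:

```swift
import AudioToolbox

// Placeholder for my existing render callback, which fills the buffer.
// Here it just clears the buffers to silence.
let myRenderProc: AURenderCallback = { _, _, _, _, inNumberFrames, ioData in
    if let abl = ioData {
        let buffers = UnsafeMutableAudioBufferListPointer(abl)
        for buffer in buffers {
            memset(buffer.mData, 0, Int(buffer.mDataByteSize))
        }
    }
    return noErr
}

var graph: AUGraph?
NewAUGraph(&graph)

// Describe the two nodes: MatrixReverb effect and default hardware output.
var reverbDesc = AudioComponentDescription(
    componentType: kAudioUnitType_Effect,
    componentSubType: kAudioUnitSubType_MatrixReverb,
    componentManufacturer: kAudioUnitManufacturer_Apple,
    componentFlags: 0, componentFlagsMask: 0)
var outputDesc = AudioComponentDescription(
    componentType: kAudioUnitType_Output,
    componentSubType: kAudioUnitSubType_DefaultOutput,
    componentManufacturer: kAudioUnitManufacturer_Apple,
    componentFlags: 0, componentFlagsMask: 0)

var reverbNode = AUNode()
var outputNode = AUNode()
AUGraphAddNode(graph!, &reverbDesc, &reverbNode)
AUGraphAddNode(graph!, &outputDesc, &outputNode)
AUGraphOpen(graph!)

// My render callback now feeds the reverb's input instead of the output unit.
var callback = AURenderCallbackStruct(inputProc: myRenderProc,
                                      inputProcRefCon: nil)
AUGraphSetNodeInputCallback(graph!, reverbNode, 0, &callback)

// Wire reverb output -> hardware output: audioBuffer -> reverb -> output.
AUGraphConnectNodeInput(graph!, reverbNode, 0, outputNode, 0)

AUGraphInitialize(graph!)
AUGraphStart(graph!)
```

Is this roughly the right approach, or am I missing a step (stream formats between the nodes, for instance)?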

At the same time, I noticed that applications using AUMatrixReverb all present the same interface for it. Is there a way to retrieve this "settings window" so I can use it to control the added reverb?
Of course, being able to embed it in an NSView would be nice too. ;)
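From what I can tell, CoreAudioKit's AUGenericView might be what I am after: it builds the standard parameter view for any AudioUnit. Here is a sketch of what I imagine, assuming `reverbUnit` is the MatrixReverb instance pulled from the graph (e.g. via AUGraphNodeInfo) and `container` is a view in my window; both names are my own:

```swift
import AppKit
import AudioToolbox
import CoreAudioKit

// Sketch: embed the standard parameter view for an AudioUnit
// (here, the MatrixReverb instance) inside one of my own NSViews.
func embedReverbView(for reverbUnit: AudioUnit, in container: NSView) {
    let view = AUGenericView(audioUnit: reverbUnit)
    view.frame = container.bounds
    container.addSubview(view)
}
```

Is that the generic view other applications are showing, or do they obtain a custom view from the unit itself?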

Thanks for reading.

Alex ROUGE

_______________________________________________

Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)
