I am developing a simple Audio Unit host, specifically for music devices (soft synths). I have scoured the web, found many different examples, and have a working program: I can select and load a music device, display and manipulate its GUI, route MIDI data to it from an external keyboard, and hear output audio via the user-selected device. However, I have a few issues that have me baffled at the moment, and any hints or advice would be greatly appreciated.
1. Some devices exhibit stuck notes, although many others are solid with none. I am doing minimal work on the real-time thread, just handing off MIDI events, and nothing else that should impact the music device's real-time audio processing. Is there a process priority or other setting I should be trying?

2. I have one older music device (specifically the Roland SOUND Canvas VA) that works fine, except that its level meters do not move. Am I missing a callback or some sort of view refresh logic needed to make them work?

3. A few devices (one example is the Martinic AX73) load fine and appear to respond to MIDI messages, but produce no output audio, and none of the Core Audio API calls return errors to indicate what might be wrong.

Thanks,
Larry
Coreaudio-api mailing list ([email protected])
