Hi Kevin,
Thanks for the info. Yes, I’m using midiOutputEventBlock (actually via the
JUCE library, but I’ve confirmed all this in the Xcode debugger) to send MIDI
from the AUv3. That part works as expected when Logic is recording: we can
record the MIDI from the AUv3 into the Logic timeline.
I’ve added a small hack: when the render callback is inactive (detected via a
basic flag that the render callback sets), a dispatch-based timer gathers the
MIDI events from the UI and sends them out. It’s a hack, and it looks like the
code below. However, when the AUv3 is sleeping, Logic does not wake the
plugin until the transport starts or external MIDI arrives. So is the MIDI
basically going to the wrong port, or is Logic not seeing these events?
Thanks
peter
// Create a timer to dispatch the plugin's generated MIDI events.
#if JucePlugin_ProducesMidiOutput
if (@available (macOS 10.13, iOS 11.0, *))
{
    midiOutputEventBlock = [au MIDIOutputEventBlock];
    const uint64_t duration = NSEC_PER_SEC / 60;  // fire at roughly 60 Hz
    dispatch_queue_t main_q = dispatch_get_main_queue();
    dispatch_source_t src =
        dispatch_source_create (DISPATCH_SOURCE_TYPE_TIMER, 0, 0, main_q);
    dispatch_source_set_timer (src, DISPATCH_TIME_NOW, duration, 0);
    dispatch_source_set_event_handler (src, ^{
        // When the render callback goes active, cancel this timer.
        if (active)  // NB: should be atomic, it's written from the render thread
        {
            dispatch_source_cancel (src);
            dispatch_release (src);  // omit under ARC, which manages dispatch objects
        }
        else
        {
            MidiBuffer midiBuffer;
            extern void CoreMIDIRenderer_processNextMidiBuffer (MidiBuffer& buffer,
                                                                const int startSample,
                                                                const int numSamples);
            CoreMIDIRenderer_processNextMidiBuffer (midiBuffer, 0, 512);

            // Send the gathered MIDI out through the host-provided block.
            if (auto midiOut = midiOutputEventBlock)
            {
                for (const auto metadata : midiBuffer)
                    midiOut ((int64_t) metadata.samplePosition
                                 + (int64_t) mach_absolute_time(),
                             0, metadata.numBytes, metadata.data);
            }
        }
    });
    dispatch_resume (src);
}
#endif
From: Kevin Nelson <[email protected]>
Date: Tuesday, 27 February 2024 at 01:46
To: Markus Fritze <[email protected]>
Cc: support (One Red Dog Media) <[email protected]>, CoreAudio API
<[email protected]>
Subject: Re: AUv3 is silent or suspended
Hasn’t this been possible for a few years now with AUv3 and MIDI output
recording from Instrument types? Béla Balazs gave a WWDC talk about this with
GarageBand on iOS in either 2017 or 2018, but I can’t seem to find the video now.
Peter, are you using either the AU’s midiOutputEventBlock
(https://developer.apple.com/documentation/audiotoolbox/auaudiounit/2866003-midioutputeventblock)
or scheduleMIDIEventBlock
(https://developer.apple.com/documentation/audiotoolbox/auaudiounit/1387576-schedulemidieventblock)
to trigger MIDI from the plugin’s own virtual keyboard? If you’re not, using
this may solve your problem, and it will have the added benefit that users will
also be able to record their performances using the plugin’s keyboard.
You can use the current “Audio Unit Extension App” Xcode templates as a
reference point for this, particularly the MIDI Effect example, which
illustrates sending note-on and -off events from the DSP Kernel’s `process`
block (search for `sendNoteOn` in the generated project). Of course, in an
Instrument, you won’t be able to use the process block to send MIDI for exactly
the reason that Markus mentioned, so you’ll need to use a different mechanism
to connect your keyboard UI to the AUAudioUnit’s `midiOutputEventBlock`.
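In case it helps to see the shape of that, here is a minimal sketch of forwarding a UI note-on through the host’s output block. The method name (noteOnFromUI) and the `myAU` reference are made up for illustration; the AUMIDIOutputEventBlock signature is from the AUAudioUnit headers, and whether a host honours AUEventSampleTimeImmediate for output events sent outside the render cycle is exactly the open question in this thread:

```objc
#import <AudioToolbox/AudioToolbox.h>

// Hypothetical UI-side handler; myAU is assumed to be the AUAudioUnit instance.
- (void) noteOnFromUI: (uint8_t) noteNumber velocity: (uint8_t) velocity
{
    AUMIDIOutputEventBlock midiOut = myAU.MIDIOutputEventBlock;
    if (midiOut == nil)
        return; // host did not provide an output block

    const uint8_t bytes[3] = { 0x90, noteNumber, velocity }; // note-on, channel 1
    // AUEventSampleTimeImmediate asks for delivery as soon as possible rather
    // than at a specific sample position; a host may instead expect the
    // current render-time sample position here.
    midiOut (AUEventSampleTimeImmediate, 0 /* cable */, 3, bytes);
}
```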
Best regards,
Kevin
On Feb 23, 2024, at 12:32, Markus Fritze via Coreaudio-api
<[email protected]> wrote:
In your case you do not have a generator plugin, but rather an instrument or
effect, and these plugin types are only processed by Logic Pro when needed,
meaning when Logic knows that there is something to process, e.g. MIDI or
audio going in. This is a long-standing optimization to avoid wasting CPU
cycles on plugins processing “nothing”. Such a plugin can’t trigger itself.
Markus
On Feb 23, 2024, at 09:20, support (One Red Dog Media) via Coreaudio-api
<[email protected]<mailto:[email protected]>> wrote:
Hi
I’m experiencing a bug with my AUv3 when hosted within Logic Pro (both Mac and
iPadOS). When a project containing my plugin is re-opened, the plugin’s own
virtual keyboard does not trigger the synth’s audio engine. It appears silent
or suspended. When the Logic transport is started, or MIDI is sent from an
external instrument, or MIDI is sent from Logic’s own virtual keyboard, my
plugin “wakes up” and the audio engine behaves as expected. During the
suspended phase, the plugin’s UI behaves as expected. I’ve managed to get the
Xcode debugger connected to my AUv3 and observed that the render callback is
not called until the transport starts or MIDI is received.
This is the last major bug, so any help in understanding what the fix is
would be appreciated.
Thanks
peter
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list ([email protected])
Help/Unsubscribe/Update your Subscription:
https://lists.apple.com/mailman/options/coreaudio-api/mfritze%40apple.com
This email sent to [email protected]