> We know that the ideal way of doing this is having both the sequencer
> and the softsynth access the same exact clock for reference, then
> having the audio app a predefined "delay in time" consisting of the
> fragment size. After that it's a simple matter of taking the current
> time before mixing a block, and mixing internally in smaller blocks
> while retrieving the events.
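the "mix internally in smaller blocks while retrieving the events" idea quoted above can be sketched roughly like this (a hypothetical illustration, not code from any real synth; all names are made up): split one audio block at each event's frame offset, so events take effect sample-accurately instead of at the block boundary.

```python
def mix_block(nframes, events, render, handle_event):
    """events: list of (frame_offset, midi_bytes), sorted by offset."""
    pos = 0
    for offset, data in events:
        if offset > pos:            # render audio up to the event
            render(pos, offset - pos)
            pos = offset
        handle_event(data)          # event lands at its exact frame
    if pos < nframes:               # render the remainder of the block
        render(pos, nframes - pos)

# example: a 256-frame block with events at frames 64 and 128 is
# rendered as three sub-blocks of 64, 64 and 128 frames
spans = []
mix_block(256, [(64, b'\x90<\x7f'), (128, b'\x80<\x00')],
          render=lambda start, n: spans.append((start, n)),
          handle_event=lambda data: None)
print(spans)  # → [(0, 64), (64, 64), (128, 128)]
```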
this is how you handle incoming MIDI data, yes. but outgoing data is not a "simple matter" if you want to pre-queue the data and/or have a resolution that matches the MIDI data rate.

> I see this as something JACK could do
> really well (take the timing before mixing a block and give it to the
> processes), so even if one has to wait for the other according to the
> graph, the sync with MIDI would be fine. I guess basically this alone
> may justify giving JACK MIDI superpowers, since we know how important
> a good MIDI sync is.

yes, but "MIDI sync" isn't defined. as i've explained before, there are two kinds:

   MIDI clock: a low resolution "tick"
   MIDI timecode (MTC): a low resolution positional reference

different kinds of devices use one or the other of these, but rarely both. most drum machines and analog-ish sequencers use MIDI clock. most HDR systems and other digital audio systems use MTC.

syncing to these two things is a very different task depending on which one you pick. MIDI clock tells you how fast you are moving, but doesn't necessarily contain any positional information. of course, many systems may also generate MIDI song position data along with MIDI clock, just to confuse matters :)

--p
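to make the distinction above concrete, here is a rough sketch of what each sync flavour gives a receiver (the constants are from the MIDI spec; the function names are invented for illustration): MIDI clock yields only a rate, Song Position Pointer adds a coarse position, and MTC yields a SMPTE-style position directly.

```python
def tempo_from_clock(tick_interval_s):
    """MIDI clock: 24 ticks per quarter note; rate only, no position."""
    return 60.0 / (tick_interval_s * 24)        # tempo in BPM

def position_from_spp(lsb, msb):
    """Song Position Pointer: a 14-bit count of 16th notes since the
    start of the song, often sent alongside MIDI clock to supply the
    missing positional reference."""
    return ((msb << 7) | lsb) / 4.0             # position in quarter notes

def position_from_mtc(hh, mm, ss, ff, fps=25):
    """MTC: an hh:mm:ss:ff positional reference (frame rate varies:
    24, 25, ~29.97 or 30 fps depending on the stream)."""
    return hh * 3600 + mm * 60 + ss + ff / fps  # position in seconds

print(tempo_from_clock(0.5 / 24))  # → 120.0  (120 BPM: one quarter = 0.5 s)
print(position_from_spp(0, 1))     # → 32.0   (128 sixteenths = 32 quarters)
```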