On Tuesday 05 March 2002 16.10, Paul Davis wrote:
[...]
> >Yeah... The stream frame rate is the clock.
>
> the window/macos worlds have been pretty unhappy with this
> arrangement, which i am convinced is part of the reason for the
> emergence of these newer interfaces which accept pre-queued
> data. timing MIDI with the audio clock just doesn't work well
> enough. even at the best typical resolution (1.3msec, or 64 frames
> at 48kHz), the MIDI sloppiness is something i've seen lots of
> complaints about on vst-plugins, for example.
Not surprising. Even 1.3 ms is worse than the old Win16 sequencers,
and you need a fast machine, a perfect configuration and luck to get
that kind of buffer size to work without (too many) drop-outs...

> granted, some
> interfaces can go lower than this, but its not clear its a good
> idea for the overall system.

It's not even clear it's *possible*, even if the interface can do it
in theory. (That may require a real OS! ;-)

> and besides, what we want, ideally, is
> a clock running at the MIDI rate.

Yeah - and then all we need is to read a common high resolution timer
(TSC) from within the MIDI and audio "engine loops", to get *perfect*
timing, down to the resolution of the respective interfaces. :-)

(Of course, unless we use RTL or RTAI for the timestamping, we'd
still need some of those dreaded PLLs - but only to deal with the
occasional latency peak into the hundreds of µs range. It's probably
sufficient to assume that the average timestamp error is zero.)

//David

.- M A I A -------------------------------------------------.
|      Multimedia Application Integration Architecture      |
| A Free/Open Source Plugin API for Professional Multimedia |
`----------------------> http://www.linuxaudiodev.com/maia -'
.- David Olofson -------------------------------------------.
| Audio Hacker - Open Source Advocate - Singer - Songwriter |
`-------------------------------------> http://olofson.net -'
