Tim Hockin wrote:

>> by definition, time isn't flowing when the transport is
>> stopped. a delay in stationary time can only be a zero
>
> umm, I know that a requirement for me is to be able to stop the sequencer,
> and still play MIDI and have delay lines etc still delay. Are you saying
> that this can't be done in your model? In my opinion just because transport
> has stopped does NOT mean time has stopped. On the contrary, ticks are
> still happening, the tempo is still in effect. Just because the SEQUENCER
> stopped sending events doesn't mean the rest of the studio did.
to allow for everything you mention here, you need to start counting a
different time -- you've stopped the transport, so transport time isn't
flowing any more. at least that's what i do, calling it 'virtual time'.

the thing is that you need to keep time well-defined and controllable at
one point, for the whole network. if you don't, things like synchronization
and transport control are tough to get right.

> Standing proposal:
>   Host processes blocks of 'n' samples. Events are delivered with a
>   timestamp that says 'actuate this event at this time within this buffer'.
>   This is exactly what user-supplied automation is, totally randomly timed
>   events. Some plugins need to sync to tempo or music-milestones. They
>   indicate this need and receive tempo, meter, ticks events. It is
>   responsible for tracking changes.

drop the tick events and it starts to sound reasonable.

buffer-relative timestamps are a definite no. reason: you need to be able
to schedule events far ahead (streaming, prequeueing). calculating a
buffer-relative time in this case requires either knowing all future buffer
sizes or updating these events at every cycle. the latter is too awkward,
and the former enforces a guarantee that severely limits the system's
synchronization capabilities.

tim
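a minimal sketch of what i mean, in C (the names and types here are mine,
not from any existing API): the host keeps one absolute sample clock for
the whole network, and transport time is derived from it -- it just stops
advancing when the transport is stopped, while the absolute clock keeps
running. events sit in the queue with absolute timestamps, and the
buffer-relative offset is only computed at dispatch time, once the current
cycle's start and length are actually known:

    /* hypothetical host-side timing sketch -- illustrative only */

    #include <stdint.h>

    typedef struct {
        uint64_t absolute_time;   /* samples since engine start, never stops */
        uint64_t transport_time;  /* samples on the timeline, frozen on stop */
        int      transport_rolling;
    } host_clock_t;

    typedef struct {
        uint64_t when;            /* absolute timestamp: schedulable far ahead */
        /* ... event payload ... */
    } event_t;

    /* called once per process cycle of 'nframes' samples */
    static void
    run_cycle(host_clock_t *clk, event_t *queue, int nqueued, uint32_t nframes)
    {
        for (int i = 0; i < nqueued; i++) {
            event_t *ev = &queue[i];

            /* only events falling inside this cycle get a buffer offset;
               everything scheduled further out stays queued untouched until
               its cycle comes around -- no per-cycle timestamp rewriting,
               and no need to know future buffer sizes */
            if (ev->when >= clk->absolute_time &&
                ev->when <  clk->absolute_time + nframes) {
                uint32_t offset = (uint32_t)(ev->when - clk->absolute_time);
                /* deliver_event(ev, offset); */
                (void) offset;
            }
        }

        clk->absolute_time += nframes;          /* always advances */
        if (clk->transport_rolling)
            clk->transport_time += nframes;     /* only while rolling */
    }

delay lines and live MIDI keep working off absolute_time even while
transport_time is frozen, which is the behaviour you're asking for, and
prequeued events keep their timestamps no matter how the buffer sizes
turn out.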