this is mostly a repeat of what rog said.  i'm a slow thinker, but i'll subject y'all to what i was thinking anyway....
> the ordering problem is misleading: you need timely response for
> interactive applications; it's a reasonably straightforward application
> of real-time programming.  (by the way, if you're passing low-level
> things like that across lossy wireless networks, you're possibly
> not addressing the most relevant problem first.)  the effects you're
> trying to synchronise are typically changes to data structures inside
> a program (including effects on the display), so that's where the
> synchronisation and interlocking should be.  that's fine.

but what acme does doesn't work.  i'm sure everyone here has typed
something in one acme window and had it appear in another.  almost
always, the keyboard and mouse are connected to the "same" hardware,
and it's people realtime, not real realtime, so it seems to me that
this problem should not exist.  and it seems to me that the problem is
exactly that the kbd and mouse are on separate channels.

what i proposed will work when all the input devices are connected to
the "same" hardware.  a combo usb kb/mouse or a standard pc kb and
mouse would qualify.  applications like acme would not need any
modification, though libraries would.  what do you propose?

> it's not as though the underlying devices weren't separate streams;
> they are, and it makes sense for the view of them to reflect that.
> it also makes it easier to add new input devices.  i see already
> you've got 'k' and 'm', with surprisingly different content, but what
> about that fingerprint thingy to unlock the cheats?  or perhaps more
> to the point the 'w' for wheel and 'p' for pedals?  you'll never
> finish.

you have this problem regardless of implementation strategy.  but this
is mostly argued in moot court, as most such devices act like either a
keyboard or a mouse.

- erik
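
p.s.  to make the shape of the thing concrete, here's a rough sketch in
go of the library side i have in mind.  the device name (/dev/input)
and the message format (a one-byte tag, 'k' or 'm', followed by a text
payload, one message per line) are made up for illustration; the real
format would differ.  the only point is that a single stream hands the
application keyboard and mouse events in the order the hardware
produced them.

// sketch of a library-side reader for a single merged input device.
// the device name (/dev/input) and message format (a 'k' or 'm' tag
// followed by a text payload, one message per line) are invented for
// illustration; the point is that one stream preserves event order.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// Event is either a keyboard rune or a mouse update, tagged so the
// application can tell which, but delivered on one channel in the
// order the messages arrived.
type Event struct {
	Tag     byte   // 'k' for keyboard, 'm' for mouse
	Payload string // tag-specific content, left unparsed here
}

// readEvents turns the merged device into a single ordered stream.
func readEvents(path string) (<-chan Event, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	c := make(chan Event)
	go func() {
		defer f.Close()
		defer close(c)
		scan := bufio.NewScanner(f)
		for scan.Scan() {
			line := scan.Text()
			if len(line) < 2 {
				continue
			}
			c <- Event{Tag: line[0], Payload: strings.TrimSpace(line[1:])}
		}
	}()
	return c, nil
}

func main() {
	events, err := readEvents("/dev/input") // hypothetical merged device
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	// the application (or an acme-like library layer) sees keyboard
	// and mouse events interleaved exactly as the hardware sent them.
	for e := range events {
		switch e.Tag {
		case 'k':
			fmt.Println("kbd:", e.Payload)
		case 'm':
			fmt.Println("mouse:", e.Payload)
		default:
			fmt.Println("other:", e.Payload) // e.g. 'w' for wheel, if it ever exists
		}
	}
}

with one file and one ordered stream, the library never has to guess
how to interleave two channels, and acme-like applications don't change
at all.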