> I don't fully understand the discussion here.

> Initially people claimed that HZ at 8000 would be a problem,
Well, my experience indicates that it _is_ a problem, at least when
using disks on piixide (or pciide).

> which for me seems a bit backwards.

Me too.  I was - am - rather puzzled by it.

> I can only see two real problems with a high clock frequency:

> 1. The overhead of just dealing with clock interrupts increase.

Yes.  This is a potential issue.  When I set HZ=8000 it was one of the
first things I looked at.  The extra interrupt handling load is
measurable, but small enough that it was an acceptable price for making
the application work.  IIRC it was on the order of 1-2% of one core.

> 2. If there are things that just give time in ticks, then ticks
> become very short. And if the assumption is that just one tick is
> enough, such an assumption can fail.

Yes.  But I have trouble seeing how such a failure could lead to a
delay that is (approximately) linear in HZ.

The only thing I've been able to think of is that some delay somehow is
having mstohz(), or moral equivalent, applied to it twice.  That is,
mstohz is applied, and then some other code assumes the result is ms
rather than ticks and applies mstohz to it again.  At HZ=100 this means
you get a tenth the delay you think you're getting.  At HZ=1000 it's
what it should be.  At HZ=8000 it's eight times what it should be.

Based on that theory, I would suspect someone has written a delay
number that should be three seconds and is actually getting .3 seconds
at HZ=100, going up to 6s at HZ=2000, 12s at 4000, and 24s at 8000
(arithmetic sketched below, after my sig), but about 2s of other stuff
happens in parallel with it, so I see the delay as 2s shorter than it
is.  I could easily see myself thinking "weird, I have to ask for
3000ms to get a delay long enough for this to work, doesn't _look_ like
three seconds, must investigate sometime" and never getting around to
actually investigating.

My principal reason for thinking this is so is that when I disabled
USB, the delay (at HZ=8000) went from 22s to about 24s - but the
absolute time at which the delay ended, the number printed in brackets,
stayed (approximately) constant at about 25.3.  This matches well with
the theory that a delay of about 24s starts early and runs in parallel
with the rest of autoconf, and USB takes about two seconds, so I see
22s of delay.  But when USB is disabled, nothing hides 2s of that 24s,
so I see the whole thing.  (I'd suspect 25s of delay except that the
numbers in brackets seem to start at 1.  Also, 24s matches better with
something getting multiplied by 80 than 25 does.)

/~\ The ASCII Mouse
\ / Ribbon Campaign
 X  Against HTML		mo...@rodents-montreal.org
/ \ Email!	     7D C8 61 52 5D E7 2D 39 4E F1 31 3E E8 B3 27 4B
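
For concreteness, here's a quick userland sketch of that double-mstohz
arithmetic.  It assumes mstohz(ms) is roughly (ms * hz) / 1000 - close
enough to the real macro for this purpose - and it takes hz as a
parameter instead of using the kernel global, just so one run can cover
several HZ values.  The 3000ms starting figure is my guess at the
nominal delay, not something pulled out of any driver source.

/*
 * Double application of an mstohz()-alike: the first call converts ms
 * to ticks; the second call mistakes those ticks for ms and converts
 * again, inflating the delay by a factor of hz/1000.
 */
#include <stdio.h>

static unsigned long
mstohz(unsigned long ms, unsigned long hz)
{
	return ms * hz / 1000;
}

int
main(void)
{
	unsigned long ms = 3000;	/* intended delay: 3 seconds */
	unsigned long hzv[] = { 100, 1000, 2000, 4000, 8000 };
	size_t i;

	for (i = 0; i < sizeof(hzv) / sizeof(hzv[0]); i++) {
		unsigned long hz = hzv[i];
		unsigned long ticks = mstohz(mstohz(ms, hz), hz);
		printf("HZ=%-4lu  intended %lus, actual %.1fs\n",
		    hz, ms / 1000, (double)ticks / hz);
	}
	return 0;
}

Running it gives .3s at HZ=100, 3s at 1000, 6s at 2000, 12s at 4000,
and 24s at 8000 - the same progression as above.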