A 'frame' in the orangebox engine does this, server side:
while (sleep(1ms)) { while (ShouldRunATick()) { RunATick() } }
You can refer to the disassembly or even the old HL2 leak if you don't
believe me. Everyone who claims otherwise is just wrong.
So the only advantage a higher FPS gives is how often the idle engine wakes up to check for the next tick. It's worth noting that sleep(1ms) could actually sleep 1ms, 5ms, or even 15ms depending on the OS, the kernel's timer mechanism, whether high-resolution sleeps are supported, etc.
So a higher FPS can give greater accuracy in tick *timing*, as each tick starts closer to its ideal time (at 66 tick, a tick should happen every ~15ms). If your FPS is 100, frames are 10ms apart, so a tick can start up to 10ms late, which is almost an entire tick. You'd still get roughly 66 ticks per second, but with lower tick accuracy; some might argue you get worse hit registration because you're less in sync with clients. However, most people play at pings of 50ms or higher. Is up to 10ms of tick jitter really that significant?
You can pull up net_graph 4 and watch the 'var:' value, which I'm pretty sure is the variation in wakeup times in milliseconds — i.e., the maximum lag in milliseconds that could be alleviated by a higher FPS.
TL;DR: If you're getting solid 66 updates/second and your var is under
10, nobody with less than 60ms ping should be complaining about "hit
registration".
- Neph
On 07/15/2011 11:04 PM, William Balkcom wrote:
With valve now limiting FPS in orangebox to 500FPS, is it recommended
just to run the server at the default now? Does this same concept
apply for the HL1 engine and non orangebox games? Are you truly
gaining anything by running at 500FPS?
_______________________________________________
To unsubscribe, edit your list preferences, or view the list archives, please
visit:
http://list.valvesoftware.com/mailman/listinfo/hlds_linux