I agree that a better explanation from Valve would help squash some of the speculation about what FPS really means (the docs I've seen talk about tickrate, but not FPS). Maybe there's an official explanation out there, and we just need to find it.

Gary, I'm not sure you're right that seemingly small amounts of jitter never represent a problem. Imagine a scenario in which a server runs at 10 FPS and a tickrate of 5, with this timeline:

Time - Network frame event (duration) - Tick event (duration)
---
0ms - Network frame (10ms) - Tick (50ms)
100ms - Network frame (10ms)
200ms - Network frame (10ms) - Tick (50ms)
300ms - Network frame (10ms)
400ms - Network frame (10ms) - Tick (50ms)
500ms - Network frame (10ms)
600ms - Network frame (10ms) - Tick (50ms)
700ms - Network frame (10ms)
800ms - Network frame (10ms) - Tick (50ms)
900ms - Network frame (10ms)

In that run, with all frames and ticks fitting neatly together, the server would have a smooth realized 10 FPS and a realized tickrate of 5. Great! But what if just one of the ticks takes much longer than expected?

0ms - Network frame (10ms) - Tick (50ms)
100ms - Network frame (10ms)
200ms - Network frame (10ms) - Tick (390ms)
300ms - It's still working on the last tick
400ms - It's still working on the last tick
500ms - It's still working on the last tick
600ms - Network frame (10ms) - Tick (50ms)
700ms - Network frame (10ms)
800ms - Network frame (10ms) - Tick (50ms)
900ms - Network frame (10ms)

The realized FPS in this case would be 7, and the realized tickrate would be 4. So the FPS didn't dip all that much and still exceeds the tickrate, and yet the client would have seen a (very noticeable, at this resolution) glitch in gameplay.
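
To make the arithmetic concrete, here's a rough Python sketch that counts realized rates over a one-second window. The timestamps are the hypothetical ones from the table above, not real server output:

    # Start times (ms) of the events that actually ran in the
    # second timeline above.
    frame_starts = [0, 100, 200, 600, 700, 800, 900]  # network frames
    tick_starts = [0, 200, 600, 800]                  # ticks

    window_ms = 1000
    realized_fps = sum(1 for t in frame_starts if t < window_ms)      # 7
    realized_tickrate = sum(1 for t in tick_starts if t < window_ms)  # 4

    print(realized_fps, realized_tickrate)  # 7 4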

Scale this up to higher FPS and tickrate values, and it's quite possible that a dip from 150 to 100, or from 90 to 66, could represent a problem. Does it always, and is it always noticeable? No, I wouldn't say that. But realized FPS is still the best measure of purely server-side performance that we currently have at our disposal.

I would like to see a realized tickrate number in addition to, or instead of, FPS. Locking the FPS to the tickrate (as L4D/L4D2 servers do by default) also effectively gives us this, but presumably there is a benefit to a decoupled, higher FPS, such as splitting some of the network processing work into smaller chunks so that individual ticks take less time.
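
For illustration only, here's a toy Python sketch of a decoupled loop (hypothetical function names and rates; not Valve's actual engine code) showing how running network frames more often than ticks lets the network work be spread out instead of piling onto each tick:

    import time

    TICK_INTERVAL = 1 / 66    # hypothetical 66-tick simulation rate
    FRAME_INTERVAL = 1 / 150  # network frames run more often than ticks

    def run_network_frame():
        """Hypothetical: handle a slice of network I/O (read packets,
        send snapshots) so no single tick has to do all of it."""

    def run_tick():
        """Hypothetical: advance the game simulation by one tick."""

    start = time.monotonic()
    next_tick = start
    frames = ticks = 0
    while time.monotonic() - start < 1.0:  # run the toy loop for one second
        run_network_frame()
        frames += 1
        if time.monotonic() >= next_tick:
            run_tick()
            ticks += 1
            next_tick += TICK_INTERVAL
        time.sleep(FRAME_INTERVAL)

    print(frames, ticks)  # roughly 150 frames and 66 ticks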

(In the real world, what could cause a tick to take so long? Commonly, a misbehaving plugin or a long disk write; the latter can happen under very heavy background disk access while the server is flushing a log.)

-John

On 11/15/2010 3:37 AM, Björn Rohlén wrote:
Instead of hiding the server_fps, it would be better to explain it in
detail.

-TheG

On Mon, Nov 15, 2010 at 9:51 AM, Gary Stanley <g...@velocity-servers.net> wrote:

I guess the new sales pitch is that when a server has FPS jitter (from, say, 100 to 150, or 66 to 90), that is bad and causes all kinds of issues.

Can Valve PLEASE PLEASE PLEASE remove FPS from rcon stats, or do something to prevent its behavior from being altered? Or lock it at 1:1 so it scales with the tickrate?


_______________________________________________
To unsubscribe, edit your list preferences, or view the list archives,
please visit:
http://list.valvesoftware.com/mailman/listinfo/hlds_linux
