Hi Bill,

The client sends a time-request packet to the server; the server stamps it and responds.
The client then takes the last send time and the current time, figures out the *average*
latency (half the round trip), and adjusts the timestamp to predict where the server
clock is right now. The client always seems to be a little bit off, though, and I think
it's because the send and receive packet delays are different, which would mean the
client is /always/ over- or under-compensating.
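
In C that boils down to something like this (just a sketch; the names are made up):

#include <stdint.h>

typedef int64_t ms_t;   /* milliseconds */

/* lt1 - local time the request was sent
   lt2 - local time the response arrived
   st  - server timestamp carried in the response */
static ms_t estimate_server_now(ms_t lt1, ms_t lt2, ms_t st)
{
    ms_t half_latency = (lt2 - lt1) / 2;    /* assumes both legs take the same time */
    ms_t offset       = st - (lt1 + half_latency);
    return lt2 + offset;                    /* predicted server time at the moment lt2 */
}

If the two legs are not equal, half_latency is biased and so is every prediction made from it.
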
Use your procedure to get an initial estimate of the difference between server time and local time, as well as the half latency. The next time you get a response, try to predict what the server time should be using the values from the previous response. It will obviously be off, but the direction of the error tells you which way to shift your prediction.

lt - local time; st - server timestamp in the response; lt1 - local time the request was sent; lt2 - local time the response was received.
1) hl = (lt2 - lt1)/2
   dif = st - (lt1 + hl)

2) predicted_st = lt1 + dif + hl    (lt1 and st now from the new round)
   hl = hl + (st - predicted_st)

... repeat
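
In C the two steps would look something like this (untested sketch, names made up):

#include <stdint.h>

typedef int64_t ms_t;   /* milliseconds */

typedef struct {
    ms_t dif;   /* estimated offset: server time minus local time */
    ms_t hl;    /* estimated half latency (one-way delay)         */
} sync_state;

/* 1) First round: seed dif and hl from one request/response pair. */
static void sync_init(sync_state *s, ms_t lt1, ms_t lt2, ms_t st)
{
    s->hl  = (lt2 - lt1) / 2;
    s->dif = st - (lt1 + s->hl);
}

/* 2) Every later round: predict the server stamp for this round's request
 *    (sent at lt1), then fold the prediction error into hl. */
static void sync_update(sync_state *s, ms_t lt1, ms_t st)
{
    ms_t predicted_st = lt1 + s->dif + s->hl;
    s->hl += st - predicted_st;
}

/* Current server time as seen from local time lt_now. */
static ms_t server_now(const sync_state *s, ms_t lt_now)
{
    return lt_now + s->dif;
}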

You have two variables, dif and hl, that you estimate from your timestamps. In my example I assume dif was estimated properly the first time around; in practice you need to keep track of how far off your prediction is and decide whether you need to redo the whole process.
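
Continuing the sketch above, one way to do that bookkeeping (the threshold is made up):

#define RESYNC_THRESHOLD_MS 50   /* made-up tolerance */

static void sync_check(sync_state *s, ms_t lt1, ms_t lt2, ms_t st)
{
    ms_t error = st - (lt1 + s->dif + s->hl);   /* how far off the prediction was */

    if ((error < 0 ? -error : error) > RESYNC_THRESHOLD_MS)
        sync_init(s, lt1, lt2, st);   /* way off: redo the whole process */
    else
        sync_update(s, lt1, st);      /* small error: just keep refining hl */
}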

I did not try this, just my 2c. Let us know how you decided to solve the problem.

Blue skies,
Alexander
