
I'm making an online tile-based 2D RPG (top-down, 32x32 px tiles). Players can move to any neighbouring tile using the arrow keys. I'm finding interpolation a little difficult to understand.

I understand that in a game where direction can change abruptly, concepts such as dead reckoning are of little use.

Client-side prediction is working as expected. However, I'm struggling to grasp how to handle the updates I receive from the server about other clients' positions, given that those clients have different latencies.

For example, I am player A.

Currently, when player B moves, I receive their new position and a timestamp from the server. On a LAN connection this works as intended: player B moves on my screen immediately, with correct interpolation. I calculate the time that passed between the server sending the data and my receiving it, and interpolate over the difference every frame (16 ms at 60 fps).
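In rough TypeScript, what I'm doing now looks something like this (a simplified sketch; the names are made up):

```typescript
// A minimal sketch of the interpolation described above; all names
// (RemotePlayer, onServerUpdate, ...) are made up for illustration.

const TILE_SIZE = 32;      // px per tile
const MOVE_DURATION = 250; // ms to cross one tile

interface RemotePlayer {
  fromX: number; fromY: number; // px, where the move started
  toX: number;   toY: number;   // px, destination tile
  moveStart: number;            // local time the move "began"
}

// Called when the server says player B moved to (tileX, tileY).
// estimatedDelayMs is my guess at how long the update took to reach me.
function onServerUpdate(p: RemotePlayer, tileX: number, tileY: number,
                        estimatedDelayMs: number): void {
  p.fromX = p.toX;
  p.fromY = p.toY;
  p.toX = tileX * TILE_SIZE;
  p.toY = tileY * TILE_SIZE;
  // Start the move in the past so the animation "catches up".
  p.moveStart = performance.now() - estimatedDelayMs;
}

// Called once per frame (~16 ms at 60 fps).
function renderPosition(p: RemotePlayer): { x: number; y: number } {
  const t = Math.min((performance.now() - p.moveStart) / MOVE_DURATION, 1);
  return {
    x: p.fromX + (p.toX - p.fromX) * t,
    y: p.fromY + (p.toY - p.fromY) * t,
  };
}
```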

However, say I have a 200 ms RTT, player B has a 400 ms RTT, and players move 1 tile every 250 ms. I then receive the data roughly 300 ms after B moved (200 ms for B's input to reach the server plus 100 ms for the update to reach me), so the 250 ms move is already finished and I only see player B teleport to the new tile.

Do I have to add some additional time, derived from their latency, onto the timestamp so that when I receive the update the timestamps line up and the animation starts smoothly?

Am I overthinking this?

Thanks.

Edit: I forgot to mention that the server sends updates every 50 ms.


1 Answer


If you're authoring a simulation where dead reckoning has no value, it's best to have the server bundle the game state, or better yet the game-state delta, as a single unit (although not necessarily a single packet, if your state is large).
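For instance, a bundled delta message might look something like this (a hypothetical shape, not a prescribed format):

```typescript
// Hypothetical shape of a bundled state delta; field names are
// illustrative, not a prescribed wire format.
interface StateDelta {
  tick: number;                 // server tick this delta applies to
  moved: Array<{ id: number; tileX: number; tileY: number }>;
  joined: Array<{ id: number; tileX: number; tileY: number }>;
  left: number[];               // ids of players who disconnected
}
```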

From your game description, I think you are not going to be able to let clients update independently of the server-authoritative game state, so client round-trip time is immaterial. Even your own client's latency and lag are immaterial: the server update rate is the value you want to interpolate over.
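As a rough sketch of what that can look like (assuming the 50 ms update rate mentioned in the question; all names here are illustrative), buffer incoming snapshots and render other players a fixed interval in the past, interpolating between the two snapshots that bracket the render time:

```typescript
// A sketch of interpolating over the server update rate, assuming the
// 50 ms tick from the question; names here are illustrative.

const SERVER_TICK = 50;               // ms between server updates
const INTERP_DELAY = 2 * SERVER_TICK; // render remote players 100 ms in the past

interface Snapshot {
  time: number; // server timestamp, ms
  x: number;    // player B's position, px
  y: number;
}

const buffer: Snapshot[] = []; // ordered oldest -> newest

function onSnapshot(s: Snapshot): void {
  buffer.push(s);
}

// Sample player B's position as of (serverNow - INTERP_DELAY),
// interpolating between the two snapshots that bracket that time.
function sample(serverNow: number): { x: number; y: number } | null {
  const renderTime = serverNow - INTERP_DELAY;
  // Discard snapshots that are no longer needed, keeping one
  // snapshot at or before renderTime to interpolate from.
  while (buffer.length >= 2 && buffer[1].time <= renderTime) {
    buffer.shift();
  }
  if (buffer.length < 2) return buffer[0] ?? null; // not enough data yet
  const a = buffer[0];
  const b = buffer[1];
  const span = (b.time - a.time) || 1; // guard against identical timestamps
  const t = Math.max(0, Math.min(1, (renderTime - a.time) / span));
  return { x: a.x + (b.x - a.x) * t, y: a.y + (b.y - a.y) * t };
}
```

With a delay of two ticks, one late update usually still leaves a pair of snapshots to interpolate between, which is what stops the remote player from teleporting.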

In the scenario where some clients have widely variable latency or significant packet loss, and many packets arrive much later than expected, you could accelerate the server-state replay locally on that client to "catch up". The client's input will still have to be applied to the current game state on the server, which that client is seeing delayed -- in some cases badly.
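A crude sketch of that catch-up, building on the buffer above (the thresholds and the 10% rate change are illustrative guesses, not tuned values):

```typescript
// A crude catch-up sketch built on the snapshot buffer above; the
// thresholds and 10% rate change are guesses, not tuned values.

const TARGET_BUFFERED = 2; // snapshots we'd like queued (about 100 ms)
let playbackScale = 1.0;   // how fast the render clock advances
let renderClock = 0;       // the serverNow value fed to sample()

function tick(frameDeltaMs: number, buffered: number): void {
  if (buffered > TARGET_BUFFERED + 2) {
    playbackScale = 1.1; // fell far behind: replay 10% faster
  } else if (buffered < TARGET_BUFFERED) {
    playbackScale = 0.9; // buffer running dry: slow down slightly
  } else {
    playbackScale = 1.0;
  }
  renderClock += frameDeltaMs * playbackScale;
}
```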

You can't really hide latency in a game like the one you're describing, where no error in the presented game state is allowable. You can only "pretend" the server game state was changing in a more analog fashion, by interpolating it between two consecutive data points. Other clients' latency is always hidden from everyone but the client itself -- even the server shouldn't really care.