I'm making an online tile-based 2D RPG (top-down, 32×32 px tiles). Players can move to any neighbouring tile using the arrow keys. I'm finding it a little difficult to understand interpolation.
I understand that in a game where direction can change abruptly, techniques such as dead reckoning aren't much use.
Client-side prediction for my own movement is working as normal. What I'm struggling with is handling updates from the server about other clients' positions, since each client has different latency.
For example, I am player A.
Currently, when player B moves, I receive their new position and a timestamp from the server. On a LAN connection this works as intended: player B moves on my screen immediately, with correct interpolation. I calculate the time elapsed between when the server sent the data and when I received it, and interpolate the remaining difference every frame (16 ms at 60 fps).
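Roughly what my interpolation looks like right now (a simplified sketch; all the names, and the assumption that one snapshot carries one tile move, are illustrative, not my actual code):

```typescript
// Simplified sketch of my current remote-player interpolation.
// TILE/MOVE constants and all names here are illustrative.
const MOVE_TIME = 250; // ms it takes to walk one tile

interface Snapshot {
  x: number;          // target position in px
  y: number;
  serverTime: number; // timestamp the server attached when it sent this
}

// Called every frame with the current clock; eases the remote player
// from their previous position toward the snapshot's target position.
function interpolate(
  from: { x: number; y: number },
  snap: Snapshot,
  now: number
): { x: number; y: number } {
  // Elapsed = network transit time plus time since the snapshot arrived.
  const elapsed = now - snap.serverTime;
  const t = Math.min(elapsed / MOVE_TIME, 1); // clamp at the target tile
  return {
    x: from.x + (snap.x - from.x) * t,
    y: from.y + (snap.y - from.y) * t,
  };
}
```

The problem is that `elapsed` already includes the transit time, so on a slow connection `t` can be at or near 1 the moment the snapshot arrives.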
However, say I have a 200 ms RTT and player B has a 400 ms RTT, and players move 1 tile every 250 ms. B's input takes ~200 ms (half their RTT) to reach the server, and the snapshot takes ~100 ms (half my RTT) to reach me, so I receive the data roughly 300 ms after B started moving, and I only see player B teleport to the new tile.
Do I have to add some extra time for their latency onto the timestamp, so that by the time I receive the update the timestamps line up and the animation starts smoothly?
Am I overthinking this?
Thanks.
Edit: I forgot to mention that the server sends updates every 50 ms.