Depends on the game. If it’s a shooting game or another twitch-based game, I’d say 60Hz. If it’s just a strategy game, set it to 20. No matter which update speed you choose, using interpolation makes it a lot smoother.
I don’t think 60Hz is necessary. You want 60fps or more, but you don’t need update packets sent at 60Hz.
I think most people would agree Counter-Strike is a twitch-based game, and I think they only broadcast approximately 20 update packets per second. But they do some fancy delay/interpolation for lag and packet-loss compensation.
source: https://developer.valvesoftware.com/wiki/Source_Multiplayer_Networking
Even if you can technically send/receive at 60Hz over the internet, you shouldn’t.
In my experience, at those rates it triggers your (well, my) ISP’s abuse-protection algorithm, and your entire connection gets severely throttled, with latency consistently over 1000ms, even after the game is closed and you’re pinging some ‘neutral’ server like google.com.
As Cas said, 20Hz is the maximum network update rate you should consider.
I haven’t seen your code, but judging from your other thread I’m going to assume you are still updating every player on each broadcast? Serialization is very costly and isn’t exactly something you should be doing on a whole List of objects even twice a second. Something you could consider is having the client maintain a list of local players, with two packets for adding and removing players from this list. For example:
public class RegisterLocalPlayerPacket {
    protected Player player;  // full player object, sent once when the player appears
    protected int identifier; // unique ID for this online player
}
public class DeregisterLocalPlayerPacket {
    protected int identifier; // same ID as in the registration packet
}
Then when you want to update something such as a player’s position, you could have yet another packet!
public class UpdatePlayerPositionPacket {
    protected int identifier;             // which player moved
    protected float relativeX, relativeY; // movement since the last update
}
Note: identifier is just a unique ID for an online player.
However, this means that on the server end you need to update players separately from one another. What I’d do is create event listeners (I’m not exactly sure how your server works, so this may not apply!) for players entering a new area. Upon entering that region, I’d broadcast a RegisterLocalPlayerPacket to everyone in that area. When the player leaves, you just broadcast the opposite packet to everyone still there.
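A sketch of what those listeners might look like; Server, Area, broadcastToArea and getId() are hypothetical names, since the actual server code wasn’t shown:

public class AreaPresenceListener {
    private final Server server; // hypothetical server handle

    public AreaPresenceListener(Server server) {
        this.server = server;
    }

    // The server calls this when 'player' walks into 'area'.
    public void onPlayerEnterArea(Player player, Area area) {
        RegisterLocalPlayerPacket packet = new RegisterLocalPlayerPacket();
        packet.player = player;               // assumes same-package field access
        packet.identifier = player.getId();
        server.broadcastToArea(area, packet); // everyone nearby adds the player
    }

    // The server calls this when 'player' walks out of 'area'.
    public void onPlayerLeaveArea(Player player, Area area) {
        DeregisterLocalPlayerPacket packet = new DeregisterLocalPlayerPacket();
        packet.identifier = player.getId();
        server.broadcastToArea(area, packet); // everyone nearby removes the player
    }
}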
Now you have the task of updating player positions. That’s easy: you just take the player’s current position and subtract their last position. You can now do:
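(The snippet that followed seems to have been lost from the quote; a minimal guess at it, reusing UpdatePlayerPositionPacket and the hypothetical names from above:)

// Server-side: send only the movement since the last broadcast.
UpdatePlayerPositionPacket packet = new UpdatePlayerPositionPacket();
packet.identifier = player.getId();
packet.relativeX = player.getX() - player.lastX; // delta, not absolute position
packet.relativeY = player.getY() - player.lastY;
player.lastX = player.getX();                    // remember for the next delta
player.lastY = player.getY();
server.broadcastToArea(player.getArea(), packet);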
@saifix
I would really prefer not having such lists on the clients; I don’t like the client handling anything on its own.
The position system isn’t float-based; a player can jump from (1,1) to (16,15) instantly at any time, and back again.
However, your answer was in-depth, and I’m going to look more into your suggestions. Thank you!
I understand your concerns, but the client isn’t handling anything on its own! The idea is that the position is kept server-side; here’s pseudo-code for it:
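(The pseudo-code itself didn’t survive the quote; a rough sketch of the idea, with hypothetical names like onMoveRequest and isMoveLegal, and using the instant tile-jump model described above:)

import java.awt.Point;
import java.util.HashMap;
import java.util.Map;

// The server owns the authoritative positions; clients only send requests.
public class ServerPositionAuthority {
    private final Map<Integer, Point> positions = new HashMap<Integer, Point>();

    // Handle a movement request from a client. Nothing moves unless the
    // server says so; the client just renders whatever comes back.
    public void onMoveRequest(int identifier, int targetX, int targetY) {
        if (!isMoveLegal(identifier, targetX, targetY)) {
            return; // illegal move: ignored, the client never moves itself
        }
        positions.put(identifier, new Point(targetX, targetY));
        broadcastPositionUpdate(identifier, targetX, targetY);
    }

    private boolean isMoveLegal(int id, int x, int y) { /* game rules here */ return true; }
    private void broadcastPositionUpdate(int id, int x, int y) { /* network send here */ }
}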
Wow, I seem to be having some pretty severe misconceptions about network coding… I’ve got to say that I’ve never heard of an ISP doing that, though. I’m pretty sure that when I’m seeding my, uhm, Linux kernels at 1MB/sec I’m sending more than 60 packets per second to the same IP. Seems more like a “problem” with your ISP than a universal problem. =P Still, sending updates that often might not be necessary. I’d still say that 20Hz is too slow for a twitch-based game. At 20Hz, you could get a maximum additional input delay of 50ms*2 (average 50ms, I guess). In total:
Keyboard/mouse delay (they usually only poll at 100Hz): 0-10ms.
20Hz send: 0-50ms.
Network delay: 0-50ms.
Server processing: Say 0-5ms.
Server broadcasting at 20Hz: 0-50ms.
Rendering delay: 16.6ms per GPU (usually 16.6*3 ms more due to the pipeline).
Monitor delay: 2-5ms.
Best case: Single GPU with a fast monitor on LAN and everything timed perfectly: 0+0+0+0+0+16.6+2 = ~20ms
Worst case: 10+50+50+5+50+16.6+5 = 186.6ms.
The worst case could be improved by about 67ms, down to roughly 120ms, with a broadcast rate of 60Hz instead of 20Hz, since both 0-50ms waits shrink to 0-16.7ms…
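To make the arithmetic easy to check, here’s a tiny throwaway sketch that reproduces the sums above for any tick rate (all numbers are the rough upper bounds from the list):

public class LatencyBudget {
    // Worst-case end-to-end delay in milliseconds for a given tick rate.
    public static double worstCaseMs(double tickHz) {
        double input = 10;                 // keyboard/mouse polling
        double sendWait = 1000.0 / tickHz; // worst case: input just missed a tick
        double network = 50;
        double serverProcessing = 5;
        double broadcastWait = 1000.0 / tickHz;
        double rendering = 16.6;
        double monitor = 5;
        return input + sendWait + network + serverProcessing
                + broadcastWait + rendering + monitor;
    }

    public static void main(String[] args) {
        System.out.println(worstCaseMs(20)); // ~186.6 ms
        System.out.println(worstCaseMs(60)); // ~120 ms
    }
}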
About your worst case: the input delay doesn’t have to go round-trip, so “Server broadcasting at 20Hz: 0-50ms” shouldn’t be added to the input delay. It’s fire-and-forget: the client immediately takes action; it doesn’t wait for the server to ACK the action (in FPS/realtime games).
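In code, the client’s input path might look something like this (PlayerInputPacket and sendToServer are made-up names, just to illustrate the fire-and-forget pattern):

// Fire-and-forget: act locally right away, never block on the server.
public void onKeyPressed(int key) {
    localPlayer.applyInput(key);              // instant response on screen
    sendToServer(new PlayerInputPacket(key)); // server validates asynchronously
    // No ACK is awaited; a later server state update corrects us if needed.
}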
The client isn’t allowed to damage other players in a shooting game. If it were, almost every firefight would end with both players killing each other, which should be impossible with infinitely fast hitscan bullets, since both would be able to get a few shots off at each other before the message that they’ve been killed reaches them. That’s actually what a game called Wolfteam did. It was a horrible lagfest: you kill a lagger. Someone else kills you. You respawn 5 seconds later. The lagger kills you instantly, since his damage was so delayed.
In the CoD games this is impossible, because you cannot damage an enemy with bullets once you’re dead (rockets and grenades obviously aren’t affected). It’s very easy to notice how they ensure this, since hit markers (a confirmation of your hit) appear after a very noticeable delay on laggy servers. In other words, a round trip is required for such a case. During that time, the enemy can run around and kill one or two of your friends even though he’s already “dead”. There’s also the shooting-around-corners problem, which is severely exacerbated by delay.
Very true points, but you’re addressing a different problem. What you’re describing seems like badly written server logic.
If the server deems player X killed by you, then the server-side logic shouldn’t send any of that player’s events (movement, bullets fired, etc.) to any clients - it shouldn’t process them at all. That way, you cannot be killed by a lagged bullet after you respawn.
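Concretely, that fix is just an early-out in the server’s packet handler; a sketch (isDead, process and broadcastToOthers are hypothetical names):

// Server-side: ignore everything a dead player's lagged packets try to do.
public void onActionPacket(Player sender, ActionPacket packet) {
    if (sender.isDead()) {
        return; // lagged bullets/moves from a dead player are dropped entirely
    }
    process(packet);                   // simulate the action
    broadcastToOthers(sender, packet); // and relay it to nearby clients
}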
Anyway… client-side ‘input latency’ is visually not affected by network latency at all, but from the server’s perspective, the ‘client input latency’ is increased by one-way latency (not round-trip). The ‘gameplay latency’ (when you see the (persistent) effect of what the server did) indeed involves round-trip latency, but that is not the same as ‘input latency’.
Yes, of course. That was obviously a server logic problem. I just spun off on a tangent for a bit. =S
I know that they aren’t exactly the same, but I wanted to show you how big a part that additional latency would be for the client. Of course a good game won’t wait for the server to acknowledge things like looking around with the mouse or walking, since that would be ridiculous, so “gameplay latency” is what is mostly affected by a low network send frequency. With proper interpolation you could make even 2Hz look smooth and fluid. Well, until you start actually interacting with the world, and people start shooting you after you’ve run around a corner, or killing you before they come out from behind one.
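For reference, the kind of interpolation being described (and what the Valve article linked earlier documents) usually works by rendering a little in the past and blending between the two snapshots straddling the render time. A rough sketch, with illustrative names:

// Snapshot interpolation: render ~100ms in the past, blending between the
// two most recent server snapshots.
public class Interpolator {
    public static class Snapshot {
        long timeMs; // server timestamp of this update
        float x;     // position in that update (one axis shown)
    }

    private Snapshot previous, latest;

    public void onSnapshot(Snapshot s) {
        previous = latest; // shift the two-snapshot window
        latest = s;
    }

    // Position to draw at wall-clock time 'nowMs'.
    public float drawX(long nowMs) {
        long renderTime = nowMs - 100; // fixed interpolation delay
        if (latest == null) return 0f;
        if (previous == null || latest.timeMs == previous.timeMs) {
            return latest.x;
        }
        float t = (renderTime - previous.timeMs)
                / (float) (latest.timeMs - previous.timeMs);
        t = Math.max(0f, Math.min(1f, t)); // clamp: never extrapolate
        return previous.x + (latest.x - previous.x) * t;
    }
}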
The current network model used by FPS games strongly promotes aggressive gameplay, since the player coming out of cover around a corner has a considerable time advantage before his position is updated on the other player’s screen. I should know; I am (or at least was) a master of abusing this in CoD.
This presentation gives a really good overview of network latency and how to hide it. There are lots of good diagrams showing where the lag comes from and what ramifications it has on gameplay. There is also a PowerPoint version somewhere at http://www.bungie.net/inside/publications.aspx
The devil is in the details. The articles in question largely make it sound like you just do bish bash bosh and Bob’s your uncle, instant networked game. It turns out that it’s not really all that easy. Riven and I spent over two months just getting a couple of guys to run around on a grid-based map shooting at each other.