Questions about port speed and required game bandwidth

Just wanted to make sure my math is right. I just finished watching the video Google I/O 2012 - GRITS: PvP Gaming with HTML5. In the talk they said that their 8 player multiplayer game used about 1.2 MBytes per second of server bandwidth.

The problem is that in the multiplayer game I’m working on, an 8 player match uses only 64 KBytes per second of server bandwidth. The gameplay for both games looks similar; they’re both top down shooters with Box2D and many game entities. I just have a hard time believing that my game is almost 20 times more bandwidth efficient than something a Google engineer could come up with, so I’m guessing I got my math wrong somewhere.

Just a heads up on what I’m doing: each player gets 30 x 256 Byte packets per second, filled with as much relevant info for that player as possible. On average it ends up being about 3 KBytes/second depending on the action on screen, but at worst I know it’s capped at about 8 KBytes per second per user. I was able to confirm this using the DDMS network statistics for Android.
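As a quick sanity check on that per-player figure (a minimal sketch; the 30 packets/second rate and 256-byte payload are the numbers above, while the 28 bytes of UDP/IPv4 header overhead is my own assumption):

```java
public class PerPlayerBandwidth {
    public static void main(String[] args) {
        int packetsPerSecond = 30;  // packets sent to each player per second
        int payloadBytes = 256;     // game data per packet
        int headerBytes = 28;       // assumed UDP (8) + IPv4 (20) header overhead

        int payloadRate = packetsPerSecond * payloadBytes;              // bytes/s of game data
        int wireRate = packetsPerSecond * (payloadBytes + headerBytes); // bytes/s on the wire

        System.out.println("payload: " + payloadRate + " B/s"); // 7680 B/s, about 7.5 KB/s
        System.out.println("on wire: " + wireRate + " B/s");    // 8520 B/s, about 8.3 KB/s
    }
}
```

So the "capped at about 8 KBytes per second" figure lines up once you count per-packet header overhead.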

My second math question :confused: is that I’m currently renting a Linode server for $20 a month for testing. The website says I get 250 Mbit network out. I’m sure that’s just a theoretical value, but let’s just assume it’s the real deal.

250 Mbit/s ÷ 8 = 31.25 MBytes/s ≈ 32000 KBytes/s (treating 1 MByte as 1024 KBytes).

32000 KBytes/s ÷ 8 KBytes/s per player = 4000 players.
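The same estimate in code (a sketch; 250 Mbit/s is the advertised Linode cap and 8 KB/s the worst-case per-player rate from above). Dividing decimal link bandwidth by a binary 8 KB/s comes out slightly under 4000, but it's the same ballpark:

```java
public class PlayerCapacity {
    public static void main(String[] args) {
        double linkBitsPerSec = 250_000_000.0;         // advertised 250 Mbit/s outbound
        double linkBytesPerSec = linkBitsPerSec / 8.0; // 31,250,000 bytes/s
        double perPlayerBytesPerSec = 8 * 1024;        // 8 KB/s worst case per player

        int players = (int) (linkBytesPerSec / perPlayerBytesPerSec);
        System.out.println("max players by bandwidth: " + players); // 3814
    }
}
```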

It doesn’t seem right that a $20 server has enough bandwidth for 4000 multiplayer users. At this point bandwidth is nowhere near the bottleneck, but for multiplayer games I always thought it was. Did cloud server providers get so advanced that this is no longer an issue for games, or is my math completely messed up? Thanks for any help! This will be my first multiplayer game released, so I’m trying to predict what kind of server I’ll need on the backend to run it.

I am pretty sure most VPS providers (at least the ones I have looked at) provide at least 100/100 Mbit internet.

That “4000 players” figure is theoretical, for a perfect scenario. Your ISP tells you you will get a certain amount, but you often only get half of it; e.g. in my case I am supposed to get around 20 Mbit down, but I only get between 5-7 Mbit down. Also, 250 Mbit may seem super advanced, but more and more people are starting to get 1000 Mbit internet (I believe my university’s provider also supplies 10 Gbit internet; there are rumors mine has it, but I have only heard of people getting around 800 Mbit).

Also, for that 4000 figure you are only taking into account the network speed required to serve 4000 players. Are you sure your server can handle 4000 players, send 120000 (30 * 4000) packets per second, and update the game at the same time?

TL;DR: You also need to take into account the hardware required to host 4000 players at one time, which will most likely be the real limit.

[quote]Also, for that 4000 figure you are only taking into account the network speed required to serve 4000 players. Are you sure your server can handle 4000 players, send 120000 (30 * 4000) packets per second, and update the game at the same time?
[/quote]
Ya, absolutely. It’s just that I had always thought the bottleneck for how many clients a multiplayer server can handle was the available port speed, but I guess things have changed in the last few years. And that’s just with 250 Mbit/s; I couldn’t imagine having 10 Gbit or hell even 1 Gbit available. Right now the bottleneck is the CPU. Thanks for your input.

Ohh ya, I thought I had put this in the Networking and Multiplayer subforum. Oops :confused:

So you upload 3-8 KB per second per player. Surely the other 7 players have to be notified of those updates.

8 players each upload 8 KB/s (server receives 64 KB/s)
8 players each download (8-1) * 8 KB/s (server sends 448 KB/s)

Now add misc. other packets initiated by the server.

This is a lot closer to the 1.2MB/s that the other game uses.
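The relay model above, where each client's 8 KB/s gets forwarded to the other 7 players, can be sketched as (player count and rates taken from the calculation above):

```java
public class RelayModel {
    public static void main(String[] args) {
        int players = 8;
        int uploadKBps = 8;  // each client sends 8 KB/s to the server

        int serverIn = players * uploadKBps;                  // everything the clients send
        int serverOut = players * (players - 1) * uploadKBps; // each update relayed to the other 7

        System.out.println("server receives " + serverIn + " KB/s"); // 64 KB/s
        System.out.println("server sends " + serverOut + " KB/s");   // 448 KB/s
    }
}
```

Note that serverOut grows with players * (players - 1), i.e. quadratically in player count, which is why the GRITS number is so much higher.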

Sorry, I should have been more specific. Each player uploads about 1 KB/s of input data to the server. The server then sends 8 KB/s to each player, filled with only what’s a priority for that player. What I’m trying to say is that the bandwidth scales linearly with the number of players: the server’s upload will always be just 8 KB/s per player.

So for 8 players on the instance server it’ll be:
8 players each upload 1 KB/s (server receives 8 KB/s)
8 players each download 8 KB/s (server sends 64 KB/s)

It’s a tradeoff between how many networked objects are a priority for each player and how much bandwidth I want to use. As the number of objects goes up, more objects are fighting for room in those 8 KB.
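That fixed per-player budget is what keeps the scaling linear; a sketch contrasting it with the naive relay model from earlier in the thread (the player counts are illustrative):

```java
public class ScalingComparison {
    public static void main(String[] args) {
        int budgetKBps = 8;  // fixed downstream budget per player

        for (int players : new int[] {8, 16, 32}) {
            int fixedBudget = players * budgetKBps;                // linear: prioritised data squeezed into 8 KB/s
            int naiveRelay = players * (players - 1) * budgetKBps; // quadratic: relay everything to everyone
            System.out.println(players + " players: fixed " + fixedBudget
                    + " KB/s vs relay " + naiveRelay + " KB/s");
        }
    }
}
```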

This is simply not possible: more players by definition means more communication, and eventually you simply cannot fit the absolute minimum amount of data needed to keep the game playable into that 8 KB.

I agree, but the idea is to get the amount of communication to increase linearly as more players join, rather than quadratically. I think we’re saying the same thing: there are only so many networked objects you can have on screen and still keep it under 8 KB/s while keeping it playable. 8 KB/s is more than enough for 8 players interacting with each other, but then again my game is a pretty simple 2D space shooter. Something like Battlefield 4 or Titanfall would have different requirements.