Questions about port speed and required game bandwidth

Just wanted to make sure my math is right. I just finished watching the video Google I/O 2012 - GRITS: PvP Gaming with HTML5. In the talk they said that their 8-player multiplayer game used about 1.2 MBytes per second of server bandwidth.

The problem is that in the multiplayer game I’m working on, an 8-player match only uses about 64 KBytes per second of server bandwidth. The gameplay for both games looks similar: they’re both top-down shooters built on Box2D with many game entities. I have a hard time believing that my game is almost 20 times more bandwidth-efficient than something a Google engineer could come up with, so I’m guessing I got my math wrong somewhere.

Just a heads up on what I’m doing: each player gets 30 × 256-byte packets per second, filled with as much relevant info for that user as possible. On average it ends up being about 3 KBytes/second depending on the action on screen, but at worst I know it’s capped at about 8 KBytes per second per user. I was able to confirm this using the DDMS network statistics on Android.
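
To show where my numbers come from, here’s a quick sketch of the arithmetic. The packet size, send rate, and player count are just the values from my setup above, and the 1.2 MB/s figure is from the GRITS talk:

```ts
// Rough bandwidth estimate for my current setup (values from above, not measured constants).
const packetBytes = 256;      // bytes per state packet
const packetsPerSecond = 30;  // server send rate per player
const playersPerMatch = 8;

// Worst case: every packet is completely full.
const perPlayerBytesPerSec = packetBytes * packetsPerSecond; // 7,680 B/s ≈ 7.5 KB/s
const perPlayerWorstCase = 8 * 1024;                         // I round that up to 8 KB/s per player
const perMatchBytesPerSec = perPlayerWorstCase * playersPerMatch; // 64 KB/s per 8-player match

console.log(`per player (raw): ${(perPlayerBytesPerSec / 1024).toFixed(1)} KB/s`);
console.log(`per 8-player match: ${(perMatchBytesPerSec / 1024).toFixed(0)} KB/s`);

// Compare against the ~1.2 MB/s figure from the GRITS talk.
const gritsBytesPerSec = 1.2 * 1024 * 1024;
console.log(`GRITS uses about ${(gritsBytesPerSec / perMatchBytesPerSec).toFixed(0)}x more`); // ~19-20x
```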

My second math question :confused: is that I’m currently renting a Linode server for $20 a month for testing. The website says I get 250 Mbit Network Out. I’m sure that’s just a theoretical value, but let’s assume it’s the real deal.

250 Mbit/s = 31.25 MBytes/s ≈ 32,000 KBytes/s.

32,000 KBytes/s ÷ 8 KBytes/s per player = 4,000 players.
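
Here’s the same capacity math as a sketch. The 250 Mbit figure is just what Linode advertises (so it’s an assumption that it’s fully usable), and I’m being loose with decimal Mbit vs. binary MByte conversions, which doesn’t change the ballpark:

```ts
// How many players a 250 Mbit/s outbound link could carry at my worst-case per-player rate.
const networkOutMbit = 250;                                        // Linode's advertised Network Out
const networkOutBytesPerSec = (networkOutMbit / 8) * 1024 * 1024;  // ~31.25 MB/s ≈ 32,000 KB/s

const worstCasePerPlayer = 8 * 1024;  // 8 KB/s per player, my measured cap
const maxPlayers = Math.floor(networkOutBytesPerSec / worstCasePerPlayer);

console.log(`max players by bandwidth alone: ${maxPlayers}`); // ~4,000
```

That only accounts for outbound bandwidth, of course; it says nothing about CPU, memory, or whatever else would actually become the bottleneck first.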

It doesn’t seem right that a $20 server could handle the bandwidth for 4,000 multiplayer users. At that point bandwidth is nowhere near the bottleneck, but for multiplayer games I always thought it was. Have cloud server providers gotten so advanced that this is no longer an issue for games, or is my math completely messed up? Thanks for any help! This will be my first released multiplayer game, so I’m trying to predict what kind of server I’ll need on the backend to run it.