Hey Guys, I need some help.
OK, basically, my update/sync code looks like this:
// This gets run every 33 ms.
public void run() {
    frame = (frame + 1) % 600;

    // Every 60 ticks, refresh the frame-count display string and reset the counter.
    if (frame % 60 == 0) {
        frameCount = "" + (framesPerUpdate / 2);
        framesPerUpdate = 0;
    }

    // Every 4th tick, ask the server for an update.
    if (frame % 4 == 0) {
        try {
            if (owner instanceof JApplet)
                dout.writeByte(UPDATE_CLIENT_TCP);
            else
                dout.writeByte(UPDATE_CLIENT);
        } catch (IOException ie) {
            System.out.println("Connection closed!");
            System.exit(1);
        }
    }

    // Apply local key input right away so movement feels responsive.
    if (avatarSelected != null)
        currentMap.updateVelocity(keyState, avatarSelected); // for caching
    currentMap.update(Theme.UPDATE_FREQUENCY);
}
So every fourth tick (a bit over 100 ms), I ask the server for an update. When the server gets an update request, it sends all of the locations, states, and extra avatar information to the client. Right now it sends about 100 bytes per avatar, which is obviously ridiculous, but I can use dirty bits and stuff to cut that down.
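Just to show what I mean by dirty bits, here's a rough sketch (AvatarState, the FIELD_* flags, and writeDelta are made-up names, not my real classes): each avatar keeps a bitmask of which fields changed since the last update, and only those fields get written.

import java.io.DataOutputStream;
import java.io.IOException;

class AvatarState {
    static final int FIELD_X     = 1 << 0;
    static final int FIELD_Y     = 1 << 1;
    static final int FIELD_STATE = 1 << 2;

    int x, y;
    byte state;
    int dirty; // bitmask of fields changed since the last update sent

    void setX(int newX)   { if (newX != x) { x = newX; dirty |= FIELD_X; } }
    void setY(int newY)   { if (newY != y) { y = newY; dirty |= FIELD_Y; } }
    void setState(byte s) { if (s != state) { state = s; dirty |= FIELD_STATE; } }

    // Write the dirty mask, then only the fields that actually changed.
    void writeDelta(DataOutputStream dout) throws IOException {
        dout.writeByte(dirty);
        if ((dirty & FIELD_X) != 0)     dout.writeShort(x);
        if ((dirty & FIELD_Y) != 0)     dout.writeShort(y);
        if ((dirty & FIELD_STATE) != 0) dout.writeByte(state);
        dirty = 0; // everything sent is clean again
    }
}

That way an avatar that hasn't changed costs one byte instead of ~100 (in a real setup I'd probably have to track the mask per client, since different clients have seen different updates).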
The problem is this: when the user hits a key, say "go right", the client instantly starts moving right. The server has to wait for the command to arrive before it starts moving the avatar. Then, when the server updates the client, the movement the client did before the server knew the player was moving gets taken away.
The end result is a visible jerk whenever you start to move or do an action.
I could try to emulate the server latency in the client by delaying my own input by roughly the one-way latency, but that would be kinda tough to get right.
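Roughly what I have in mind (DelayedInputBuffer, InputCommand, and ESTIMATED_ONE_WAY_MS are all made up): hold each key event in a queue and only apply it locally once it's about as old as the one-way trip to the server.

import java.util.ArrayDeque;
import java.util.Deque;

class DelayedInputBuffer {
    static final long ESTIMATED_ONE_WAY_MS = 50; // guess at half the ping

    static class InputCommand {
        final long timeMs;
        final int keyState;
        InputCommand(long timeMs, int keyState) {
            this.timeMs = timeMs;
            this.keyState = keyState;
        }
    }

    private final Deque<InputCommand> pending = new ArrayDeque<InputCommand>();

    // Called the moment the player presses or releases a key.
    void queue(int keyState) {
        pending.addLast(new InputCommand(System.currentTimeMillis(), keyState));
    }

    // Called every tick: only hand back input that is old enough, so the
    // local avatar starts moving at about the same time the server's copy does.
    Integer pollDue() {
        InputCommand head = pending.peekFirst();
        if (head != null && System.currentTimeMillis() - head.timeMs >= ESTIMATED_ONE_WAY_MS) {
            pending.removeFirst();
            return head.keyState;
        }
        return null;
    }
}

The obvious downside is that the controls would then feel laggy all the time, which is part of why it seems tough.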
I could also let the server handle all of the state changes, but that might have some weird consequences.
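What I mean by that is something like this on the client (all the names here are placeholders, not my real code): the client never moves its own avatar, it just forwards key state and draws whatever the last server update said, so there's nothing for the server to take away.

import java.io.DataOutputStream;
import java.io.IOException;

class AuthoritativeClientAvatar {
    private volatile int x, y;        // only ever written from server updates

    private final DataOutputStream dout;

    AuthoritativeClientAvatar(DataOutputStream dout) {
        this.dout = dout;
    }

    // Called every tick with the current key state: just forward it, don't move.
    void sendInput(int keyState) throws IOException {
        dout.writeByte(keyState);
    }

    // Called by the network reader thread whenever a server update arrives.
    void onServerUpdate(int serverX, int serverY) {
        x = serverX; // no local guess to blend with or snap back from
        y = serverY;
    }

    int getX() { return x; }
    int getY() { return y; }
}

The weird part I'm worried about is that the avatar wouldn't react until a full round trip later, so everything would feel delayed by the ping.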
I figured that everyone who does network programming must have the same problem. What do the experts do to fix this situation?