Eliminating jitter

I have implemented a simple client/server multiplayer “move-a-dot” game.
I currently do all the simulation server side. The client sends keyPressed/keyReleased messages. The server updates the simulation 30 times a second and sends a packet
to each client containing the absolute positions of all clients.
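For context, the messages are basically this simple (illustrative names only, my actual classes differ a bit):

```java
import java.io.Serializable;

// Client -> server: key state changes only.
class InputMessage implements Serializable {
    int keyCode;      // e.g. KeyEvent.VK_LEFT
    boolean pressed;  // true on keyPressed, false on keyReleased
}

// Server -> clients, 30 times a second: absolute positions of everyone.
class StateMessage implements Serializable {
    int[] playerIds;
    float[] xs;
    float[] ys;
}
```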

Now when I run this I get horrible jitter, and it isn’t due to lag (on my own machine I can easily increase the tickrate to 60 Hz without problems).
First I thought of simply doing some delayed client-side interpolation, but it doesn’t work that well, since player movement on the client is only initiated once the server
starts spewing position packets back. (It’s a glorified telnet client :) )

I am using TCP (Nagle off) but can’t see how changing to UDP would help, since my client is capable of receiving 30 packets a second quite well, at least running locally.
After searching this excellent forum for similar issues my question is this:

Is there an easier way to get rid of this problem other than implementing full client-side prediction/dead reckoning?

Now I don’t mind learning to implement the aforementioned techniques, but it was my impression that they were only seriously needed for game state updates occurring less than 20 times a second?
In other words am I missing some obvious way of eliminating the jitter?

Sounds like you probably just need to dampen the values you’re receiving from the server. TCP shouldn’t be losing very many packets and they should be arriving in order, so anything related to that likely isn’t your problem. I’m guessing they’re simply not being received at completely regular intervals or something like that, so you don’t want to update to exactly match what got sent; instead you want to move towards what got sent. Does that make sense?
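Something along these lines (just a sketch of the damping idea; the 10f factor is arbitrary, tune it to taste):

```java
// Sketch: each render frame, ease the drawn position toward the latest
// position received from the server instead of snapping to it.
class SmoothedDot {
    float targetX, targetY;   // last values received from the server
    float displayX, displayY; // what we actually draw

    void onServerUpdate(float x, float y) {
        targetX = x;
        targetY = y;
    }

    // dtSeconds = frame delta time in seconds
    void updateDisplay(float dtSeconds) {
        // Move a fraction of the remaining distance; 10f is an arbitrary damping factor.
        float t = Math.min(1f, 10f * dtSeconds);
        displayX += (targetX - displayX) * t;
        displayY += (targetY - displayY) * t;
    }
}
```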

Yes, it makes sense. I’ve already tried that; it’s what I meant by “delayed client side interpolation”. But the delay is a major problem with that technique, since the character only starts moving when the first position update arrives from the server. Besides, it still looked jumpy. (What I did, in more detail, was constantly interpolate towards the received absolute position over 33 ms, since my server tickrate/heartbeat is 30 Hz.)
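In case it helps, the interpolation I tried was roughly this (simplified sketch, the details in my actual code differ):

```java
// On each server update, start moving from the currently displayed position
// toward the received absolute position, spread over one 33 ms server tick.
class InterpolatedDot {
    static final float TICK_MS = 1000f / 30f; // server heartbeat interval

    float fromX, fromY;     // displayed position when the update arrived
    float targetX, targetY; // absolute position from the server
    float drawX, drawY;     // what actually gets drawn
    float elapsedMs;

    void onServerUpdate(float x, float y) {
        fromX = drawX;
        fromY = drawY;
        targetX = x;
        targetY = y;
        elapsedMs = 0f;
    }

    // Called every render frame with the frame delta in milliseconds.
    void update(float deltaMs) {
        elapsedMs += deltaMs;
        float t = Math.min(1f, elapsedMs / TICK_MS);
        drawX = fromX + (targetX - fromX) * t;
        drawY = fromY + (targetY - fromY) * t;
    }
}
```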

Funny thing is, I tried sending the heading instead of absolute coordinates and I still get the jitter, even though the server only updates the client’s heading vector, i.e. the movement is calculated on the client with the usual scale-by-frame-delta-time. I then tried running Kevin Glass’s space invaders game (http://www.cokeandcode.com/spaceinvaders/si101.jnlp) and noticed that the movement of the player character is also jerky as hell (and that example uses frame-delta-based movement too).

Using frame deltas works very well in OpenGL, so I’m a bit puzzled why it’s so bad with Java2D graphics. I think it’s a combination of regularly occurring FPS bursts and the cast of float values to int when actually displaying the graphics with Java2D.
I did a little more browsing on Kevin’s excellent site and found this tilemap collision example demo: http://www.cokeandcode.com/collision/tilemaps/tilemap.jar

This demo exhibits much smoother movement, albeit still a little jerky, but it’s a vast improvement over the 2D space invaders implementation. Curious as to why, I looked at the code and found this snippet:

Which IMHO is quite a brilliant implementation of frame delta smoothing. As far as I understand, I would get the same result by keeping the last 5 frame times in a buffer and averaging them, so it’s a clever way of averaging the frame delta over time.
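In essence, I think it boils down to something like this (my own sketch of the averaging idea, not the actual code from the demo):

```java
// Frame delta smoothing by averaging the last few frame times
// (my reconstruction of the idea, not the snippet from the demo).
class FrameDeltaSmoother {
    private final long[] samples = new long[5]; // last 5 frame deltas
    private int index = 0;
    private int filled = 0;

    // Feed in the raw delta for this frame, get back the smoothed delta.
    long smooth(long rawDeltaMs) {
        samples[index] = rawDeltaMs;
        index = (index + 1) % samples.length;
        if (filled < samples.length) filled++;

        long sum = 0;
        for (int i = 0; i < filled; i++) sum += samples[i];
        return sum / filled;
    }
}
```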
Now I’d love to apply this technique to my game, but since I’m running the movement on the server and sending back absolute coordinates, I don’t use the delta for anything client side.

What I could do is apply this to my server movement, but I thought using a Timer/TimerTask at 33 ms intervals (1000/30) would guarantee a somewhat steady 30 Hz tickrate. Can anyone confirm this? And do you think it would be a good idea to calculate delta time (and smooth the values) on the server, even though it’s supposed to run at a fixed rate?
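For what it’s worth, the tick is scheduled roughly like this (simplified; tick() stands in for my real simulation step and position broadcast):

```java
import java.util.Timer;
import java.util.TimerTask;

// Simplified version of how my 30 Hz server tick is scheduled.
// scheduleAtFixedRate tries to keep the long-run rate (late ticks are
// followed by quicker ones), while schedule() delays each tick relative
// to when the previous one actually ran.
public class ServerLoop {

    static void tick() {
        // placeholder for my real simulation step + broadcasting positions
    }

    public static void main(String[] args) {
        Timer timer = new Timer("server-tick");
        timer.scheduleAtFixedRate(new TimerTask() {
            @Override
            public void run() {
                tick();
            }
        }, 0, 1000 / 30); // ~33 ms period
    }
}
```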

Guess I kinda solved it myself. It seems one can’t do completely smooth movement in a window with Java2D (at least using the Linux implementation).
I’ve added a “local player” and am applying frame delta smoothing and interpolation to both. It’s still smooth movement for a second, then a little jitter, etc. Very bursty.
At least my networking code works, since the remote player now moves as smoothly as the local one.

If anyone can think of anything I haven’t tried for doing smooth movement in a window with Java2D, I’m all ears. Guess I should really be asking that question in the Java2D forum though.

Java2D shouldn’t be particularly jittery, although doing double buffering wrong or handling repaint() wrong can cause jitters.
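If you’re not already doing it, the usual way around repaint() timing is active rendering on a Canvas with a BufferStrategy, roughly like this (bare-bones sketch, not tuned for your game):

```java
import java.awt.Canvas;
import java.awt.Color;
import java.awt.Dimension;
import java.awt.Graphics2D;
import java.awt.image.BufferStrategy;
import javax.swing.JFrame;

// Bare-bones active rendering loop with a BufferStrategy, bypassing repaint().
public class ActiveRenderDemo {
    public static void main(String[] args) {
        JFrame frame = new JFrame("Active rendering");
        Canvas canvas = new Canvas();
        canvas.setPreferredSize(new Dimension(640, 480));
        canvas.setIgnoreRepaint(true);          // we draw ourselves, ignore AWT paints
        frame.add(canvas);
        frame.pack();
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);

        canvas.createBufferStrategy(2);         // double buffering
        BufferStrategy strategy = canvas.getBufferStrategy();

        float x = 0;
        while (true) {
            Graphics2D g = (Graphics2D) strategy.getDrawGraphics();
            g.setColor(Color.BLACK);
            g.fillRect(0, 0, canvas.getWidth(), canvas.getHeight());
            g.setColor(Color.WHITE);
            g.fillRect((int) x, 200, 10, 10);   // the "dot"
            g.dispose();
            strategy.show();

            x = (x + 1) % canvas.getWidth();
            try { Thread.sleep(16); } catch (InterruptedException e) { break; }
        }
    }
}
```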

I more or less disagree; with pure Java2D, smooth windowed animation is far too hard, close to impossible. BufferStrategy drawing is some help, but not even close to year-2009 quality.

See this topic about game loops; it covers several game loop approaches, and none of them satisfies me. But maybe I am spoiled by a windowed Direct3D vsync app with CPU affinity masked to a single core.

I don’t have an answer for how Java should handle the multicore timer issue; probably use a customized timer.dll.


Hmm, I’m going to have to try that code out. Looks interesting; I’d like to see if it actually smooths anything.

The whole jitter problem seems to be a timer issue on dual/quad cores more than anything. Java doesn’t know when a part of the processor has shut itself down for a few milliseconds.

As for the server question, it seems like the client is where you want to do the smoothing, but I’m not sure.

I’ll post back in a couple of days when I’ve tried that smoothing code. Not sure if it is going to affect much, because as long as I set my timer to something not divisible by 10, it seems to become more or less accurate.