(This was also posted on forums.java.sun.com)
Since the beginning of time, people have been trying to come up with a timer in pure Java. The problem has been that System.currentTimeMillis() is not accurate across systems. On Windows 2000, for example, its resolution is 10 ms. Of course, 10 ms is still enough to time up to 100 frames per second. The problem is that you never know where you are within that 10 ms, resulting in inconsistent drawing rates of 22 ms, 18 ms, 27 ms, etc. This makes the animation appear to “jump” around on the screen. As a result, System.currentTimeMillis() has been useless for timing framerates. That is, until now.
As I was thinking about the problem, I realized that there’s no need to increase the resolution, just to sync based on the resolution. Older DOS-based games used to latch onto every other screen refresh for timing. Serial ports wait for the control signal to change. These methods weren’t meant to make something faster; they were meant to synchronize timing. Then I realized what I really needed was to latch onto the leading edge of the digital wave that the timer produces. i.e., if I always wait until exactly after the timer is updated, I should have exactly the resolution of the timer until the next tick.
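The edge-latching idea can be sketched in a few lines (this is my own illustration, not the code from the package linked below): busy-wait until System.currentTimeMillis() changes, so that you resume exactly on a tick boundary.

```java
public class TimerEdge {

    /**
     * Busy-waits for the next tick of the low-resolution timer and
     * returns its value. After this returns, we are sitting exactly on
     * the leading edge of a timer update, so we have a full timer
     * resolution (e.g. 10 ms on Windows 2000) until the next tick.
     */
    public static long nextTick() {
        long start = System.currentTimeMillis();
        long now;
        do {
            now = System.currentTimeMillis();
        } while (now == start); // spin until the timer actually updates
        return now;
    }

    public static void main(String[] args) {
        // Latching onto two consecutive edges measures the resolution.
        long t1 = nextTick();
        long t2 = nextTick();
        System.out.println("Timer resolution: " + (t2 - t1) + " ms");
    }
}
```

The spin loop burns CPU, but on the leading edge you know your position within the tick precisely, which is the whole point.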
Well, the idea was great in theory. In practice, it was about as successful as mixing coordinate systems. You see, this method allowed me to write code that would hit 50 fps right on the dot (no small achievement for such a low-res timer). It seems that should work great, right? Wrong. The problem was that the VSync was running at 60 Hz. This left a remainder (10 fps) that couldn’t be evenly split between frames. Had I chosen 30 fps, I would have had better success due to the nice even split (2 VSyncs per frame). The problem (again) is that the resolution of the Windows 2000 timer (10 ms) doesn’t leave much room for distribution of the extra timing (100/30 = 10/3 = 3.333333).
The best solution I’ve managed to produce for this is to separate the timing of the frame from the logic. If the logic runs on its own timer, the animation should appear to be smooth, since every frame of drawing is being taken advantage of. And indeed, this does appear to be much smoother. However, there is still a slight jerkiness visible. In a real game this would probably disappear, but it’s always best to try for perfection.
The entire package with source, javadocs, a JAR lib and an example file is available here:
http://java.dnsalias.com/timer/timer-1.0.zip
Let me know what you think. Enjoy!