Apart from @theagentd’s highly astute observations about framerate and the LWJGL examples, the DWM (Desktop Window Manager, i.e. desktop composition) with OpenGL is entirely happy to occasionally not bother rendering the odd frame for you, causing what appears to be random jitter.
Bottom line: on Windows you need the DWM turned off if you’re on Vista or 7, or to be running fullscreen; you need vsync on; and you need to actually sync your update loop to the display refresh rate. If any of these criteria are not met you’ll get little glitches.
Again, is this something that only affects Java and not Game Maker, then?
[quote]and you need to actually sync your update loop to the display refresh rate. If any of these criteria are not met you’ll get little glitches.
[/quote]
I think this is what the XNA forums link was referring to. Could you perhaps provide some example code or pseudo-code to explain how this is done?
Disabled “Aero Peek” and “Enable desktop composition”.
With vsync the world is fine as always; without it, those little stutters turn into tears now.
Not that this contradicts what you said, Cas.
But you can’t expect people to have composition disabled when it isn’t off by default, which is obviously a problem here…
No, this affects all OpenGL games under the DWM. DirectX games have better control over how they interact with the DWM and appear not to suffer from this issue, but I stand to be corrected.
Syncing to the display refresh rate is actually best achieved using vsync. It’s more accurate than even the nanotimer: although the display might report that it’s doing 60Hz, in reality it might actually be 59.97Hz. I use both vsync and a timer as a backup - sometimes vsync is reported as working when it actually isn’t. In addition, Linux machines tend not to know what their refresh rate is, or don’t have vsync capability at 60Hz (e.g. Compiz, that worthless heap of shit currently ruining Linux for everyone, runs at an entirely useless 50Hz, I’m told).
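For those asking for example code: here’s a minimal sketch of that vsync-plus-timer-backup approach, assuming LWJGL 2’s Display API (the window size and the 60 FPS cap are arbitrary choices of mine):

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;

public class VsyncLoop {
    public static void main(String[] args) throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(800, 600));
        Display.create();
        Display.setVSyncEnabled(true); // primary sync: block on the monitor's refresh
        while (!Display.isCloseRequested()) {
            // ... update and render here ...
            Display.update();  // swaps buffers; blocks if vsync is honoured
            Display.sync(60);  // timer backup, in case vsync is silently ignored
        }
        Display.destroy();
    }
}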
I see, so it may make sense that GM is unaffected.
[quote]Syncing to the display refresh rate is actually best achieved using vsync. It’s more accurate than even the nanotimer: although the display might report that it’s doing 60Hz, in reality it might actually be 59.97Hz. I use both vsync and a timer as a backup - sometimes vsync is reported as working when it actually isn’t. In addition, Linux machines tend not to know what their refresh rate is, or don’t have vsync capability at 60Hz (e.g. Compiz, that worthless heap of shit currently ruining Linux for everyone, runs at an entirely useless 50Hz, I’m told).
Cas
[/quote]
But there’s no way to do this in Java2D, right? Slick2D has a way, but I heard it only works in fullscreen mode; is that true? In any case, turning on vsync in windowed mode doesn’t seem to improve the stuttering much (if at all).
Because Slick uses LWJGL, I’m sure Slick’s setVSync just calls LWJGL’s Display.setVSyncEnabled().
I don’t know the technicalities, but like Cas said, Display.setVSyncEnabled() really syncs very well.
It’s definitely very noticeable, and different, even in windowed mode.
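For what it’s worth, here’s roughly what that looks like from the Slick side - a hedged sketch assuming Slick2D’s standard BasicGame/AppGameContainer API, with setVSync() presumably just delegating to Display.setVSyncEnabled() underneath:

import org.newdawn.slick.AppGameContainer;
import org.newdawn.slick.BasicGame;
import org.newdawn.slick.GameContainer;
import org.newdawn.slick.Graphics;
import org.newdawn.slick.SlickException;

public class VsyncDemo extends BasicGame {
    public VsyncDemo() { super("vsync demo"); }
    public void init(GameContainer container) throws SlickException {}
    public void update(GameContainer container, int delta) throws SlickException {}
    public void render(GameContainer container, Graphics g) throws SlickException {}

    public static void main(String[] args) throws SlickException {
        AppGameContainer app = new AppGameContainer(new VsyncDemo());
        app.setDisplayMode(800, 600, false); // windowed, to test the case discussed here
        app.setVSync(true);                  // presumably Display.setVSyncEnabled(true) under the hood
        app.start();
    }
}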
If you’ve got the DWM turned off and you’re running in a window, and vsync’s on, the only thing left you can do is accurately sync to the hi-res timer. Unfortunately LWJGL’s Display.sync() method somehow manages to do this wrongly if you need it to be dead accurate. Try this:
import org.lwjgl.Sys;

public class Sync {

    private long timeThen;

    public Sync() {
        // Mask off the sign bit so a wrapped timer value stays non-negative
        timeThen = Sys.getTime() & 0x7FFFFFFFFFFFFFFFL;
    }

    public void sync(int frameRate) {
        long timeNow = Sys.getTime() & 0x7FFFFFFFFFFFFFFFL;
        if (timeNow < timeThen) {
            // Account for clock wrapping
            timeThen = timeNow;
        }
        // NB: integer division here - the rounding this causes comes up later in the thread
        long timeNext = (timeThen + Sys.getTimerResolution() / frameRate) & 0x7FFFFFFFFFFFFFFFL;
        if (timeNext < timeNow) {
            // Just forget about it this frame, clock's wrapped
            timeThen = timeNow;
            return;
        }
        do {
            Thread.yield();
            timeNow = Sys.getTime() & 0x7FFFFFFFFFFFFFFFL;
        } while (timeNow < timeNext);
        timeThen = timeNext;
    }
}
Remove the Thread.yield() for ultra-precise timing.
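Plugged into a basic LWJGL loop, the class above would be used something like this (a sketch; the render comment is a stand-in for your own drawing code):

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;

public class SyncDemo {
    public static void main(String[] args) throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(640, 480));
        Display.create();
        Sync sync = new Sync();
        while (!Display.isCloseRequested()) {
            // ... render here ...
            Display.update();
            sync.sync(60); // spin (with yields) until the next 1/60s boundary
        }
        Display.destroy();
    }
}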
One more thing: almost nobody cares about smoothness when running in a window. It’s not how people play real games. Web toys, etc. - that’s what people play in windows, with accordingly lower expectations.
Yeah, I mentioned that - I think it’s true.
Although back then I played things like Fallout 3 and World of Warcraft windowed, to keep an eye on MSN or whatever.
Tried Cero’s LWJGL example with and without princec’s Sync class; still stuttered. Note, however, that that example uses variable timesteps; I still haven’t tried a fixed timestep with interpolation.
It will round the sleep time every frame, so it will sync to ~63 FPS too - the same problem as LWJGL’s Display.sync(). (With a 1000-tick-per-second timer, for example, 1000/60 rounds down to 16 ticks per frame, which is 62.5 FPS.)
The stuttering that you see is due to two frames being submitted before a screen refresh can take place, meaning that one frame is lost. At 63 FPS on a 60Hz display, you’re dropping roughly every 20th frame. Result: visible but not “measurable” stuttering.
Why does this happen at 60 FPS? I already explained that above, but here we go again. You have literally no control over when a frame is actually completed, because of the command buffer, so there is no way to prevent the core cause of this problem. The command queue can cause frames to be completed at irregular intervals, which affects the chance a frame has to actually make it to the monitor before being overwritten by the next one.
So even if you submit your commands perfectly synced to the screen refresh rate (exactly 60 batches of commands per second), there is no guarantee that this will still line up with the refresh rate after going through the command buffer, the actual rendering process and then the transfer to the screen. Many things can affect how long it takes for a frame to pass through all these stages:
- The driver thread is obviously sleeping while you’re filling the command buffer. It’s affected by the same timing problems as our own Java threads.
- Even if the driver is woken up at the exact same time, there might be other programs using your CPU, causing commands to be delayed.
- Even your graphics card is a shared medium. Everything that’s drawn on your screen goes through your graphics card, so why expect it to process your program’s commands immediately? We’re talking pretty heavy context switches, delays in sending the data to the graphics card, delays in rendering due to other load, etc.
Now to the funniest part: why do people expect their monitors to have billion-dollar atomic clocks built in? It’s not as if your monitor has a more precise clock than your computer. On the contrary, I’d expect a cheap monitor to have a pretty inaccurate clock, possibly running too fast or too slow, or even with heavy jitter. To ensure that no frames are lost, you shouldn’t synchronize to the nominal value your monitor claims to run at; you should synchronize to the monitor itself, with all the syncing flaws it comes with. I’d guess that a monitor might actually update at anywhere between 59.5 and 60.5 FPS, giving you a lost or duplicated frame every other second.
Why are you guys so sensitive to this? I mean, your TVs refresh at 60 FPS, but movies are stored at 23.976 FPS. How can your screen handle that? IT CAN’T. It simply shows the same frame for either 2 or 3 refreshes to achieve somewhat smooth movement. I bet you didn’t even know that, and have never been bothered by it.
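To make that concrete, here’s a throwaway illustration of the 2-3 repeat pattern (known as 3:2 pulldown) - my own sketch, not anything from this thread:

public class Pulldown {
    public static void main(String[] args) {
        double filmRate = 24000.0 / 1001.0;    // 23.976 FPS film
        double screenRate = 60000.0 / 1001.0;  // 59.94 Hz display
        // Each refresh shows whichever film frame is current; frames end up
        // held for 3, 2, 3, 2, ... refreshes.
        for (int refresh = 0; refresh < 10; refresh++) {
            int frame = (int) (refresh * filmRate / screenRate);
            System.out.println("refresh " + refresh + " -> film frame " + frame);
        }
    }
}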
Things that won’t solve this kind of stuttering:
- Better sleep precision than what we have right now.
- Newer, more advanced game loops and delta handling than those in the game loop thread.
- Syncing everything perfectly to 60Hz.
If you try any of these, you’re syncing on the wrong side of the graphics card.
There is no way of completely getting rid of this stuttering. It is, simply put, an inherent consequence of your computer running more than one process and more than one thread, your graphics card having a command buffer, your graphics card being used by more than one thread at a time, and your monitor being imperfect - plus lots of other things.
VSync helps, though. Your computer tries to supply a single frame for each screen update. However, you can still “miss a train” and get a duplicated frame. The time it takes for a frame to go from rendering commands to being displayed on your monitor is so susceptible to factors beyond your control that you’re sometimes bound to lose a frame or two no matter how well you sync your game loop.
Obviously I can see the stuttering in the bouncing box demo posted by Cero. The reason it’s so easy to see is that you’re moving the box one pixel per update, making it easy to spot when one pixel is skipped. In a game, how many things move at a constant 60 pixels/second? For your information, running this with a green box on a red background will produce glaringly obvious artifacts on some very expensive monitors, due to how they work: they simply assume that nothing you draw is going to be moving at a constant 1 pixel per update in red and green.
A better solution is to just render at as high an FPS as possible. If you’re rendering at 120 FPS and a frame gets overwritten, the lost time is only half of what a dropped frame costs at 60 FPS, so it’s much less visible - try it with any halfway decent sync code (Display.sync(), the Sync class posted above, etc.). The higher your FPS, the less visible the stutter. If you want to get rid of tearing too (which gets worse at higher FPS), use triple buffering. Remember that vsync and triple buffering can be forced in your drivers, and that should kick in for OpenGL-accelerated Java2D too.
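In LWJGL 2 terms that’s just the earlier loop with vsync off and a higher cap - a sketch, with triple buffering left to the driver as described:

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;

public class HighFpsLoop {
    public static void main(String[] args) throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(800, 600));
        Display.create();
        Display.setVSyncEnabled(false); // don't block on the refresh; rely on the cap below
        while (!Display.isCloseRequested()) {
            // ... render here ...
            Display.update();
            Display.sync(120); // a dropped frame now costs 1/120s instead of 1/60s
        }
        Display.destroy();
    }
}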
So drop it. You can’t fix this completely. I’m so sick of people whining about a pixel here and there. It’s not as if your game will get bad reviews for this kind of stuttering - commercial games have exactly the same problem. This is not in any way related to Java or your choice of operating system. It’s quite simply the way it is.
In response to all the recent replies: please stop. It’s sad to watch.
Do not use interpolation. The only way you will get apparently smooth motion is a precisely fixed logic rate tied to a precisely fixed frame update rate.
If you get stuttering with vsync turned on, then the problem is not your code, nor LWJGL, but Windows.
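Read literally, that means a loop shaped something like this - my sketch of the idea, not Cas’s actual code; one fixed logic step per frame, with the frame rate itself pinned by vsync:

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;

public class FixedStepLoop {
    public static void main(String[] args) throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(800, 600));
        Display.create();
        Display.setVSyncEnabled(true);
        while (!Display.isCloseRequested()) {
            tick();            // exactly one fixed logic step per frame - no deltas, no interpolation
            // ... render here ...
            Display.update();  // blocks until the next refresh when vsync is honoured
        }
        Display.destroy();
    }

    private static void tick() {
        // hypothetical logic update: move everything by its fixed per-frame velocity
    }
}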
Yeah, that’s not really the issue here. My game uses fixed timesteps as well - it hasn’t really got anything to do with the rendering; it’s only there to compensate the logic if the game lags.
Yeah, it didn’t seem to make any difference - I mean, with vsync enabled I have to scroll around for a number of seconds to get one stutter anyway, but I got one with this line removed too.
Although without vsync it is kinda bad - stuttering a little every second.
@theagentd - not everyone uses LWJGL for games. The whole reason it came about, in fact, in the dim distant past, is that I wrote a library to do television graphics output, which needed graphics at a rock-steady 60Hz. This I achieved; however, I rely on actual vsynced fullscreen displays, which is the only cast-iron way to do it right. Also, it turns out a surprising number of objects in 2D games run at very precise, unchanging velocities, and any hiccup in them looks jarring to people used to the slick, smooth, professional feel of console graphics - and to anyone brought up on home computers in the 80s.
Sadly you’re right about the Sync class I posted: it will creep out of sync due to rounding, which is what LWJGL attempts to fix in its Display.sync() but fails to. I don’t use this loop myself; instead I use a method that counts time over a big number of frames, resetting every now and again. But again, this is only a backup, to ensure that the game doesn’t race if vsync fails.
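For the curious, that counting-over-many-frames backup might look something like this - a hypothetical reconstruction of the idea, not Cas’s actual code (the reset interval is an arbitrary choice of mine, and the timer-wrap handling from the earlier class is omitted for brevity):

import org.lwjgl.Sys;

public class AveragingSync {
    private static final int RESET_INTERVAL = 600; // re-baseline every ~10s at 60 FPS

    private long startTime = Sys.getTime();
    private int frameCount;

    public void sync(int frameRate) {
        frameCount++;
        // Schedule frame n from a fixed baseline rather than from the previous
        // frame, so integer-division rounding can't accumulate over time.
        long target = startTime + frameCount * Sys.getTimerResolution() / frameRate;
        long now = Sys.getTime();
        while (now < target) {
            Thread.yield();
            now = Sys.getTime();
        }
        if (frameCount >= RESET_INTERVAL || now > target + Sys.getTimerResolution()) {
            // Periodic reset, or we've fallen over a second behind: re-baseline.
            startTime = now;
            frameCount = 0;
        }
    }
}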