I’ve been experimenting with determining the phase & period of the vsync signal (by timing a tight redraw loop in fullscreen exclusive mode),
with the idea of using this information to accurately time repaints in windowed mode to minimize or even eliminate tearing.
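This is roughly what I mean by “timing a tight redraw loop” (a stripped-down sketch, not my actual code; it assumes show() blocks on the vsync when page flipping is in effect, which seems to be the case in fullscreen exclusive mode here):

```java
import java.awt.Color;
import java.awt.Frame;
import java.awt.Graphics;
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;
import java.awt.image.BufferStrategy;

// Illustrative sketch: flip a fullscreen-exclusive BufferStrategy in a
// tight loop and time it with System.nanoTime() to estimate the refresh rate.
public class VsyncProbe {
    public static void main(String[] args) {
        GraphicsDevice device = GraphicsEnvironment
                .getLocalGraphicsEnvironment().getDefaultScreenDevice();
        Frame frame = new Frame(device.getDefaultConfiguration());
        frame.setUndecorated(true);
        frame.setIgnoreRepaint(true);
        device.setFullScreenWindow(frame);
        frame.createBufferStrategy(2);          // page flipping if available
        BufferStrategy strategy = frame.getBufferStrategy();

        int frames = 600;                       // ~10 seconds at 60Hz
        // warm up so the JIT and the driver settle down
        for (int i = 0; i < 120; i++) {
            flip(strategy);
        }
        long start = System.nanoTime();
        for (int i = 0; i < frames; i++) {
            flip(strategy);                     // should block once per vsync
        }
        long elapsed = System.nanoTime() - start;
        double hz = frames / (elapsed / 1e9);

        device.setFullScreenWindow(null);
        frame.dispose();
        System.out.printf("observed refresh: %.6f Hz%n", hz);
    }

    private static void flip(BufferStrategy strategy) {
        Graphics g = strategy.getDrawGraphics();
        g.setColor(Color.BLACK);
        g.fillRect(0, 0, 16, 16);               // trivial draw, just to flip
        g.dispose();
        strategy.show();
    }
}
```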
Timing the phase & period of the vsync has been easy enough, but… (yes, the inevitable “but”…)
The actual refresh rates I’m measuring do not (quite) correspond to the refresh rate the display is supposed to be operating at (according to Windows).
I’m using an HDTV as my monitor, and these are the values I’m observing for the different refresh rates my screen supports.
Reported : Observed
24Hz : 23.974257Hz -> 23.97Hz ?
25Hz : 24.99952Hz -> 25Hz
30Hz : Barfs
50Hz : 49.99325Hz -> 50Hz
60Hz : 59.93889Hz -> 59.94Hz ?
I realize that the way I’m collecting the timings (System.nanoTime()) has its own error margins.
However, the results clearly show that for 24Hz & 60Hz the drivers report my refresh rate to be one value, whereas in reality it is something slightly different.
Another problem is that I’m observing drift in the results. My understanding is that the vsync signal is as regular as clockwork; is that a false assumption?
Or is it, as I suspect, System.nanoTime() that is drifting? (Though it could be something wrong with my code too…)
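One way I could try to tell genuine drift apart from plain measurement jitter would be to record the System.nanoTime() value after every flip and fit a straight line through (frame index, timestamp) by least squares; the slope is the estimated period, and a steady trend in the residuals would indicate real drift. A sketch of that idea (illustrative only, names are made up):

```java
// Sketch: fit timestamp ~ a + b*frameIndex by least squares.
// The slope b is the estimated vsync period in nanoseconds; if the
// residuals wander steadily in one direction the phase is really
// drifting, if they just scatter it is measurement noise.
class DriftCheck {
    static void analyse(long[] timestamps) {
        int n = timestamps.length;
        // work relative to the first sample so the doubles keep their precision
        double[] y = new double[n];
        for (int i = 0; i < n; i++) y[i] = timestamps[i] - timestamps[0];

        double meanX = (n - 1) / 2.0, meanY = 0;
        for (double v : y) meanY += v;
        meanY /= n;

        double sxy = 0, sxx = 0;
        for (int i = 0; i < n; i++) {
            sxy += (i - meanX) * (y[i] - meanY);
            sxx += (i - meanX) * (i - meanX);
        }
        double periodNs = sxy / sxx;                 // slope = estimated period
        double intercept = meanY - periodNs * meanX;
        System.out.printf("estimated period: %.1f ns (%.6f Hz)%n",
                periodNs, 1e9 / periodNs);

        // print a handful of residuals to eyeball the trend
        for (int i = 0; i < n; i += Math.max(1, n / 10)) {
            double residual = y[i] - (intercept + periodNs * i);
            System.out.printf("frame %5d  residual: %+.0f ns%n", i, residual);
        }
    }
}
```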
It’s really the drift that’s stopping this experiment from getting any further, but the false reporting of the refresh rate intrigued me enough
to ask whether anyone else had encountered it, or even considered that the OS might be lying to you! ;D
:EDIT:
I might hook up one of my 19" monitors & see what it reports for ‘60Hz’.