System.nanoTime() versus System.currentTimeMillis()

I’ve read all sorts of things on the internet about how inaccurate currentTimeMillis() is on Windows, so I thought I would do an experiment on Windows 7.

I wrote a class that sleeps for less than the putative timer resolution of Windows and timed the sleep in both nanos and millis. Then I put the ratio of the results (expressed as a percentage) in a table.

```java
import java.util.Map;
import java.util.TreeMap;

public class TryItOut {

    public static void main(String[] args) throws InterruptedException {
        TreeMap<Integer, Integer> multiCounter = new TreeMap<Integer, Integer>();
        for (int i = 0; i < 500; i++) {
            long nowNano = System.nanoTime();
            long nowMilli = System.currentTimeMillis();
            Thread.sleep(10L);
            long nano = System.nanoTime() - nowNano;
            long milli = System.currentTimeMillis() - nowMilli;
            // ratio of the two measurements, expressed as a percentage
            int diff = (int) (nano / milli / 10000);
            Integer count = multiCounter.get(diff) == null ? 1 : multiCounter.get(diff) + 1;
            multiCounter.put(diff, count);
        }
        for (Map.Entry<Integer, Integer> sortedEntry : multiCounter.entrySet()) {
            System.out.println(sortedEntry.getKey() + ":" + sortedEntry.getValue());
        }
    }
}
```

Now, before someone points out that Thread.sleep() is itself inaccurate: that doesn’t matter here, because the loop looks at the proportional (not the absolute) difference between nanos and millis.

Just to be clear, I am not measuring the accuracy of the timers but only the degree to which the timers agree with each other.

These are the results for a 10ms sleep (ratio in percent : count of iterations):

91:2 94:1 96:2 97:2 98:6 99:394 100:87 101:3 103:2 106:1

I would say currentTimeMillis() ain’t that bad!

The spread of course gets wider as you go for smaller sleeps. This is for 5ms sleeps:

82:1 84:1 85:1 88:1 89:1 90:1 91:1 94:2 95:3 96:2 97:10 98:39 99:352 100:59 101:16 102:5 106:1 109:2 110:1 116:1

…but for 60FPS (17ms) the results get quite tight:

94:2 95:5 96:1 97:3 98:19 99:360 100:84 101:17 102:2 103:5 104:2

My benchmark app uses System.currentTimeMillis() because it measures large timescales of at least a second. The difference between nanos and millis for one second intervals is:

99:149 100:351

EDIT: At the risk of belaboring the point, I ran the 1sec test again, with the ratio measured in thousandths:

998:5 999:113 1000:382
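The only change for that run is presumably the sleep length and the divisor; the measurement lines become something like this (a sketch, not the exact code I ran):

```java
// 1-second sleep, ratio scaled to thousandths (1000 = exact agreement)
Thread.sleep(1000L);
long nano = System.nanoTime() - nowNano;
long milli = System.currentTimeMillis() - nowMilli;
int diff = (int) (nano / milli / 1000);   // ~1e9 ns / ~1e3 ms / 1000 ≈ 1000
```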

You’re measuring the precision of the system timers using the very system timers that are alleged to be imprecise. You can see how this might present a problem? The trouble is that obtaining a precise high-frequency timer will “fix” the system timer for that process: a sort of Heisenbug if I’ve ever seen one.

Measuring cumulative error, which manifests as missed frames, might be more accurate. I lack the math chops to remember exactly how to measure that sort of thing though :stuck_out_tongue:

I’m just using Thread.sleep() to cause a certain delay, which then gets measured using both currentTimeMillis() and nanoTime(). Then I look at the proportional difference between the two measurements. So I don’t see the problem you see.

Of course, if both measures were inaccurate in the same direction at the same time, my loop would see nothing. But that is not relevant here: I am only interested in how far the results of nanoTime() and currentTimeMillis() diverge from each other.


Adding a println of milli just before the diff line, this is what I get:

```java
            System.out.println(milli);
            int diff = (int) (nano / milli / 10000);
/*
16
15
16
0
Exception in thread "main" java.lang.ArithmeticException: / by zero
	at Main.TryItOut.main(TryItOut.java:18)
*/
```

Same test, on Win XP :wink:
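The zero comes from the clock granularity: on XP the default tick is about 15.6ms, so a 10ms sleep can report 0 elapsed millis. If you just want the loop to survive there, a guard along these lines is one way (just a sketch):

```java
// On XP the default tick is ~15.6ms, so a 10ms sleep can register 0 elapsed ms;
// skip those iterations instead of dividing by zero.
if (milli == 0) {
    continue;
}
int diff = (int) (nano / milli / 10000);
```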

The test in the OP doesn’t seem useful or meaningful.

IIRC, use nanoTime() to measure time deltas more precisely, and currentTimeMillis() to get a more accurate calendar timestamp value.
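Roughly, the split looks like this (a small illustrative sketch, not from any particular app):

```java
import java.util.Date;

public class TimerUsage {
    public static void main(String[] args) throws InterruptedException {
        // nanoTime(): only meaningful as a delta between two calls in the same JVM
        long start = System.nanoTime();
        Thread.sleep(50L);                      // stand-in for the work being timed
        long elapsedNanos = System.nanoTime() - start;

        // currentTimeMillis(): meaningful as an absolute wall-clock timestamp
        long timestamp = System.currentTimeMillis();
        System.out.println(new Date(timestamp) + ": work took " + elapsedNanos + " ns");
    }
}
```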

If you are benchmarking code, consider one of the microbenchmark frameworks such as jmh: http://openjdk.java.net/projects/code-tools/jmh/.
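If anyone wants to compare the raw cost of the two calls, a minimal JMH benchmark looks roughly like this (class name is a placeholder; you would normally generate the project from the jmh archetype and run it with the jmh runner):

```java
import org.openjdk.jmh.annotations.Benchmark;

public class ClockBenchmark {

    // Returning the value keeps the JIT from eliminating the call.
    @Benchmark
    public long nanoTime() {
        return System.nanoTime();
    }

    @Benchmark
    public long currentTimeMillis() {
        return System.currentTimeMillis();
    }
}
```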

You could try this to force a high precision timer on XP:
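Presumably what is meant here is the usual trick of parking a daemon thread in a very long sleep, which keeps the JVM holding the Windows timer at high resolution for the whole process. A sketch:

```java
// Commonly cited workaround: a daemon thread stuck in a long sleep makes the
// Windows JVM keep the high-resolution interrupt period for the whole process.
Thread timerHack = new Thread(new Runnable() {
    public void run() {
        try {
            Thread.sleep(Long.MAX_VALUE);
        } catch (InterruptedException ignored) {
            // dies with the JVM
        }
    }
}, "high-res-timer-hack");
timerHack.setDaemon(true);
timerHack.start();
```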

@gene9 the test is meaningful in the sense that it clearly demonstrates that the accuracy of currentTimeMillis() on Win7 is better than 16ms (assuming that nanoTime() is more accurate than that), which I did not know. I agree it does not tell you anything about the actual accuracy of either timer, but that was not the question I was asking!

Although… using a sleep forces a high resolution timer in any case (as long as the sleep is longer than the resolution of the regular timer), so I had better find a different way of forcing a delay; see the sketch below. But at least I know that Win7 has a high res timer by default.
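One option, assuming it is the sleep call that raises the timer resolution, is a plain busy-wait on nanoTime(); a rough sketch (it burns a core while it spins):

```java
// Busy-wait delay that never calls Thread.sleep(), so it should leave the
// default Windows timer resolution untouched. Spins on a core while waiting.
static void busyWait(long delayNanos) {
    long end = System.nanoTime() + delayNanos;
    while (System.nanoTime() < end) {
        // spin
    }
}
```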