How does getTimeInMillis() work??

I’m trying to get my game to render properly without refresh lines. But all the rendering code I’ve come across seems to need getTimeInMillis(). So my question is: how do you use getTimeInMillis()? Or, if you know a way to render without a timer, please tell me. Thank you.

P.S. Sorry if this has been asked already; I’m new and I couldn’t find it using search.

I’m a newbie too.
I don’t really understand it clearly either, but Java has
System.currentTimeMillis() if you need it.
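For the original question, here’s a minimal sketch of using System.currentTimeMillis() to time frames. The Thread.sleep(16) is just a stand-in for rendering work, and the class name is mine:

```java
public class FrameTimer {
    public static void main(String[] args) throws InterruptedException {
        long last = System.currentTimeMillis();
        for (int frame = 0; frame < 3; frame++) {
            Thread.sleep(16);                  // stand-in for rendering one frame
            long now = System.currentTimeMillis();
            long delta = now - last;           // milliseconds this frame took
            last = now;
            System.out.println("frame " + frame + " took ~" + delta + " ms");
        }
    }
}
```

The delta value is what you’d feed into your movement/animation code so the game runs at the same speed regardless of framerate.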

True, though its accuracy is bad under Win98.

Most games use GAGE today. You can Google for it.

JDK 1.5 has a nanosecond timer which is guaranteed to have good accuracy.

[quote]True, though its accuracy is bad under Win98.

Most games use GAGE today. You can Google for it.

JDK 1.5 has a nanosecond timer which is guaranteed to have good accuracy.
[/quote]
Ack! Poor guy! He’ll be trying to figure out what N-GAGE has to do with anything.

The GAGETimer API can be found at http://java.dnsalias.com

Thanks JB.

jk

http://java.dnsalias.com:

[quote]GAGETimer has been updated to support the Java 1.5 System.nanoTime() method. The nanotimer is the default timer under 1.5 VMs. Previous Java versions will continue to use System.currentTimeMillis() or the native Windows timer.
[/quote]
Is the “native Windows timer” more accurate than the Java currentTimeMillis() timer? I thought currentTimeMillis() was native…

The currentTimeMillis() timer only ticks 20-100 times per second on Windows. This is far too slow for a high performance game. Thus I added a DLL to GAGETimer that allowed it to access the Windows Hi-res timer which ticks ~30,000 times per second. This is what most people have been using. 1.5 has added a new timer (nanotimer) that ticks some ungodly number of times per second. This is now the default if you’re using 1.5. ;D
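The tick rate described above can be checked empirically. Here’s a rough sketch (mine, not part of GAGE) that spins for a short window and records the smallest jump currentTimeMillis() ever makes — that jump is the timer’s granularity:

```java
public class TimerGranularity {
    public static void main(String[] args) {
        long smallest = Long.MAX_VALUE;
        long last = System.currentTimeMillis();
        long end = last + 200;                 // sample for ~200 ms
        while (last < end) {
            long now = System.currentTimeMillis();
            if (now != last) {                 // the clock just ticked
                smallest = Math.min(smallest, now - last);
                last = now;
            }
        }
        // Historically 10-55 ms on Windows; modern VMs usually report ~1 ms.
        System.out.println("smallest observed tick: " + smallest + " ms");
    }
}
```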

[quote]http://java.dnsalias.com:
Is the “native Windows timer” more accurate than the Java currentTimeMillis() timer? I thought currentTimeMillis() was native…
[/quote]
There are multiple timers under Win32.

currentTimeMillis() uses the standard system timer; its accuracy ranges from bad (under NT) to horrible (under 98).

There is another timer called the multimedia timer, but there are Win32 performance reasons why the standard currentTimeMillis() doesn’t use it.

If you are under 1.5, use the nanosecond timer.
Otherwise use GAGE :slight_smile:
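If you are on 1.5, using the nanosecond timer looks roughly like this. A minimal sketch (the loop is just a stand-in for game work); note that System.nanoTime() has an arbitrary origin, so only differences between two readings are meaningful:

```java
public class NanoTimerDemo {
    public static void main(String[] args) {
        long start = System.nanoTime();        // arbitrary origin: only deltas matter
        double x = 0;
        for (int i = 0; i < 1000000; i++) {    // stand-in for one frame of game work
            x += Math.sqrt(i);
        }
        long elapsedNanos = System.nanoTime() - start;
        System.out.println("work took " + (elapsedNanos / 1000000) + " ms ("
                           + elapsedNanos + " ns); result=" + x);
    }
}
```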

But don’t use it on a game server :slight_smile: client side is fine, but not on the server: ~60% CPU usage vs. 0.1% :slight_smile:

No idea why, but that was the only change :slight_smile:

Endolf

Welp, I did say there were performance reasons for using the lower-accuracy timer under Win32.

Otherwise they would’ve just changed the timer’s implementation instead of creating a whole new entry point.

That was under Linux, so that probably makes a huge difference, as I’m not sure how GAGE handles Linux.

[quote]That was under Linux, so that probably makes a huge difference, as I’m not sure how GAGE handles Linux.
[/quote]
Depends on how you use it. GAGE is built to be the most accurate and precise timer for games possible with current Java technology. If you’re going to use it for a game server, make sure you don’t use the sleep methods! These methods are designed to spin on sleep, and will degrade the performance of a multithreaded server environment. If you just want to know the current time, GAGE should still work fine.

What were you using it for on your server?

[quote]…make sure you don’t use the sleep methods! These methods are designed to spin on sleep, and will degrade the performance of a multithreaded server environment.
[/quote]
What does it mean for a method to “spin on” sleep (or any other method) ???
This may seem obvious to most, so sorry :-[ … keep in mind I’m a newbie.

[quote]What were you using it for on your server?
[/quote]
I’m just the BOFH in this case, but I think it was a sleepUntil.

Endolf

Guess I’ll have to own up… it was the MA server and it was indeed using sleepUntil. I had assumed on Linux that it would just use a Thread.sleep(), hence my mistake.

Spinning on the sleep means that it constantly loops round with only a very small pause, using up lots of processor time…

My mistake has been rectified :slight_smile:

Kev


public void sleepUntil(long time)
{
    // Spin, yielding to other threads, until the clock reaches 'time'.
    while (System.currentTimeMillis() < time) Thread.yield();
}

By yielding instead of sleeping, the method will return almost to the moment the clock flips over to match the “time” variable. The only problem is that it will use 100% of the CPU to do it. Of course, this isn’t a big deal as the method is actually giving up its time to other processes, but anything that the other processes don’t use will come right back to that loop.

This method is particularly advantageous when dealing with the System.currentTimeMillis() timer. For example, if I’m deploying my game on Windows 2000, System.currentTimeMillis() gives a tick rate of 100 ticks per second. If we use Thread.sleep(), we have no idea how many milliseconds we have until the next tick. This means that the framerate may jump all over the place as we’ll have no idea how often we need to sleep, or for how long. But if we latch onto the digital wave generated by the clock, we can always guarantee exactly 10ms to render each frame. Extra time can be slept off by waiting for the next clock tick.

Here’s a graph to demonstrate:


--    --    --
  |__|  |__|

  ^  ^  ^  ^
  |  |  |  |
begin rendering

Make sense?
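The latching scheme described above can be sketched like this (a hypothetical helper of my own, not GAGE’s actual implementation):

```java
public class TickLatchedLoop {
    // Spin (yielding) until currentTimeMillis() next changes value,
    // so that every frame starts exactly on a clock tick.
    static long waitForNextTick(long lastTick) {
        long now;
        while ((now = System.currentTimeMillis()) == lastTick) {
            Thread.yield();
        }
        return now;
    }

    public static void main(String[] args) {
        long tick = System.currentTimeMillis();
        for (int frame = 0; frame < 5; frame++) {
            // render() would go here; any leftover time is burned in the spin below
            tick = waitForNextTick(tick);
            System.out.println("frame " + frame + " began on tick " + tick);
        }
    }
}
```

Because each frame begins on a tick boundary, the time available per frame is a whole number of ticks rather than some unknown fraction, which is what keeps the framerate steady.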

[quote]Guess, I’ll have to own up… it was the MA server and it was indeed using sleepUntil. I had assumed on linux that it would just use a Thread.sleep(), hence my mistake. … My mistake has been rectified :slight_smile:
[/quote]
All’s well that ends well, right? ;D