Sys class issue?

Hi,

I upgraded from 0.95 to 0.98-1 yesterday and, without any code change, my frame rate dropped to 48-49 fps; with version 0.95 my game was running at a synchronized 60 fps. I use the Sys class to handle fps and for animation timing. I’ve read on the LWJGL forums (http://lwjgl.org/forum/viewtopic.php?t=1217&highlight=sys) that Sys.getTimerResolution() seems to always return 1000 on Windows. Do you think it might be related to my problem? It seems that something in the Sys class has been broken since 0.95.

There was a timing problem with, I think, LWJGL 0.96 and earlier, but it has been fixed now. The best way to cap the game at a certain fps now is to use

Display.sync(60);

Maybe try that instead of the timer you’re using at the moment.
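A minimal sketch of how that might look in an LWJGL Display-based main loop (the done flag and render() method are placeholders, not part of your code):

    while (!done) {
        render();          // draw the frame
        Display.update();  // swap buffers and poll input
        Display.sync(60);  // sleep just enough to cap the loop at 60 fps
    }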

If you use Display.sync(), how do you handle movement timing if the fps drops below 60? If it’s 50, then all the character moves must adapt to that fps, since all the moves are time based (pixels/second). For example, if my character moves at 200 pixels per second, then how do you adapt the character’s movement if the fps drops below 60?

Using tick-based movement also means your movement slows down if the FPS is low.
This is usually not a problem, but if it is, you should use time-based movement instead.
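For example (a minimal sketch; x and dtSeconds are made-up names, where dtSeconds is the duration of the last frame in seconds), a character moving at 200 pixels per second is simply scaled by the frame time, so it covers the same distance per real second at 60 fps or at 50 fps:

    x += 200f * dtSeconds; // 200 pixels/second, regardless of frame rate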

Matzon,

according to your answer, do you think I need to replace my use of Sys with lwjgl.util.Timer or the GAGE timer?

But isn’t getTimerResolution() what’s used to convert “tick”-based movement to “time”-based movement?

Kev

Correct, but if this method always returns 1000 under Windows, it’s no longer based on the CPU clock speed and so not accurate at all.

Well, if it’s time based using a delta, movement shouldn’t be affected even if the fps goes below or above 60.

Oh, by the way, have a look at SystemTimer.getTime() in the space invaders demo included with LWJGL; it handles the issue very nicely.

Absolutely, that’s why I was surprised when Matzon referred to tick-based being the problem. If Windows always returns 1000, that’s not a problem: 1000 ticks per second means that each tick happens to be a millisecond on Windows (unless of course you wanted sub-millisecond timing!)

Kev

Sorry, but I don’t understand your reasoning. Sys.getTimerResolution() used to return the actual CPU ticks/sec, and now, whatever CPU the game is running on, it always returns 1000??? How can such a value be 100% accurate? Please explain this to me.

About the SystemTimer class you referred to: it actually wraps the GAGE timer, so it has no relationship with the LWJGL Sys class. My goal is to use only LWJGL and no other third-party lib.

getTimerResolution() just refers to the resolution of the clock, which in the case of Windows is 1000. It was previously a CPU-dependent value, but now it’s a multimedia timer, since there are BIG issues with using the old high-performance timer.
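In other words (a minimal sketch; the variable names are made up), elapsed time should always be computed against whatever resolution is reported, so the same code works whether the resolution is 1000 or a CPU-dependent value:

    long startTicks = Sys.getTime();
    // ... render a frame ...
    long elapsedTicks = Sys.getTime() - startTicks;
    float elapsedSeconds = (float) elapsedTicks / Sys.getTimerResolution();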

Slightly confused here… but I was replying to: “how do you handle movement timing if the fps drops below 60?”
No matter what timer you use, you will still have problems if you base your movement upon fps, since slower fps = slower movement.
To solve that issue, you will have to use time-based movement instead, that is, x units per y time elapsed.

Movement in my game is already time based. I just upgraded to version 0.98 and now I get 48-49 fps max instead of a capped 60. I still don’t understand how my code can work as before given the change in the Sys class. What kind of change do I need to make in my code?

If it is time based, I don’t understand why you have issues with movement??
If you move x pixels over a given time period, you should at the end of that period have moved just as far regardless of whether you’re getting 60 or 0.001 FPS.

As to why the framerate has dropped, it might be because of some internal LWJGL changes; it would have to be benchmarked to identify.

Below is what I do in my rendering loop. As you can see, there is a dependency on Sys.getTimerResolution() to calculate the actual time of the current frame. If Sys.getTimerResolution() used to return, for example, 1400 before 0.98 (I don’t remember the exact value), then that would explain the fps drop, don’t you agree?

Maybe there is something wrong in it???


    // Initializes rendering loop variables.
    frameCount = 0;
    long fpsCountTicks = 0;
    long actualFrameTicks = 0;
    long ticksPerSecond = Sys.getTimerResolution();
    frameTicks = ticksPerSecond / maxFps;
    
    startFrameTicks = Sys.getTime();
    
    Graphics2D g;
    running = true;
    while (running) {
      // Make the Graphics2D object available before rendering.
      g = (Graphics2D) bufferStrategy.getDrawGraphics();
      Game2DContext.setGraphics(g);

      // Render all animation stuff.
      animation.render();
      if (animation.isCompleted()) {
        break;
      }

      // Paint the frame and render it.
      animation.paint();
      g.dispose();

      // Put the offscreen image to the screen.
      bufferStrategy.show();

      // Used to calculate fps.
      frameCount++;

      actualFrameTicks = Sys.getTime() - startFrameTicks;
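      // Busy-wait (yielding the CPU) until the minimum frame duration has
      // elapsed, so the loop is capped at maxFps.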
      if (actualFrameTicks < frameTicks) {
        while (Sys.getTime() - startFrameTicks < frameTicks) {
          Thread.yield();
        }
        actualFrameTicks = Sys.getTime() - startFrameTicks;
        startFrameTicks = Sys.getTime();
      } else {
        startFrameTicks = Sys.getTime();
        Thread.yield();
      }

      // Update the current frame duration.  This will allow animations to calculate movements accordingly.
      Game2DContext.setFrameDuration(((float) actualFrameTicks) / ticksPerSecond);

      // Do the frame rate calculations.
      fpsCountTicks += actualFrameTicks;
      if (fpsCountTicks >= ticksPerSecond) {
        fps = frameCount;
        Game2DContext.setFps(fps);
        frameCount = 0;
        fpsCountTicks = 0;
      }
    }


I don’t think so. “frameTicks” is derived from the resolution, so it always scales with the number of ticks occurring in a second.
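To put rough numbers on that (assuming maxFps = 60): at a resolution of 1000, frameTicks = 1000 / 60 = 16 ticks (integer division) of 1 ms each, a budget of about 16 ms per frame; at a resolution of 1400, frameTicks = 1400 / 60 = 23 ticks of 1/1400 s each, about 16.4 ms per frame. The wall-clock cap works out roughly the same either way.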

Kev

Actually, the SystemTimer class that comes with LWJGL doesn’t use the GAGE timer!
http://cvs.sourceforge.net/viewcvs.py/java-game-lib/LWJGL/src/java/org/lwjgl/examples/spaceinvaders/

Do I have to put my glasses on? I don’t see a class called SystemTimer at the URL you gave me. The version of this class I checked was on Kevin’s website. It embeds an AdvancedTimer object, which is the GAGE timer. I guess that version is not up to date with the latest LWJGL version.

Whoops, I could have sworn it was there. OK, maybe I’m just getting old, but here’s a quick copy-paste:

import org.lwjgl.Sys;

public class SystemTimer {

	/** The number of "timer ticks" per second */
	private static long timerTicksPerSecond;

	/** A little initialisation at startup, we're just going to get the timer going */
	static {
		timerTicksPerSecond = Sys.getTimerResolution();
	}

	/**
	 * Get the high resolution time in milliseconds
	 *
	 * @return The high resolution time in milliseconds
	 */
	public static long getTime() {
		// we get the "timer ticks" from the high resolution timer
		// multiply by 1000 so our end result is in milliseconds
		// then divide by the number of ticks in a second giving
		// us a nice clear time in milliseconds
		return (Sys.getTime() * 1000) / timerTicksPerSecond;
	}
}

Then in the main game loop you can do

long delta = SystemTimer.getTime() - lastLoopTime;
lastLoopTime = SystemTimer.getTime();

to get the delta, which you can use with your game entities.
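For example (a minimal sketch; x, speed, and lastLoopTime are hypothetical names), a character moving at 200 pixels per second would use that delta like this:

    long now = SystemTimer.getTime();
    long delta = now - lastLoopTime; // elapsed time in milliseconds
    lastLoopTime = now;

    // speed is in pixels per second; delta is in milliseconds
    x += speed * (delta / 1000f);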

I’m not sure what the confusion is all about??? You don’t really even need to know what Sys.getTimerResolution() returns. You just need to look at the difference between now and then and divide by the resolution, and that’s the time that’s passed between now and then, in seconds. If your framerate is suddenly lower, then it’s nothing to do with the timer at all.

Cas :)

And what do you think about what I posted previously: